Music: "Air Prelude" by Kevin Macleod.

A short and creepy montage of scenes shot around the ever-photogenic island of Manhattan, filmed entirely in high dynamic range. It's composed of some HDR timelapse footage I shot, along with a collection of slow-motion and normal 24fps footage processed from Red Epic-X RAW video, which I captured recently and then exported as -2, 0, +2 TIFF stacks to be tone mapped in Photomatix using a batch-processing workflow. Please note that none of this was shot using HDRx -- it's all normal exposures from the camera, post-converted into HDR using the traditional faux-HDR method of pushing and pulling the RAW file to create bracketed images.
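For anyone curious what the push/pull-then-merge idea looks like in practice, here's a minimal, hypothetical sketch in Python. It is not the workflow used for this video: the real latitude comes from pulling brackets out of the RAW file, and the actual tone mapping was done in Photomatix. OpenCV's Mertens exposure fusion simply stands in here for a detail-compressing tone mapper, and the filenames are placeholders.

```python
import cv2
import numpy as np

def fake_bracket(frame_linear, stops=(-2, 0, 2)):
    """Push/pull a single linear frame (float32, 0..1) to simulate a -2/0/+2 bracket."""
    return [np.clip(frame_linear * (2.0 ** s) * 255.0, 0, 255).astype(np.uint8)
            for s in stops]

def tone_map(bracket):
    """Fuse the pseudo-bracket into one displayable frame via exposure fusion."""
    fused = cv2.createMergeMertens().process(bracket)   # float32 output, roughly 0..1
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)

# Example: process one frame of a 16-bit TIFF sequence exported from the RAW video.
# "frame_0001.tif" is a placeholder filename, not from the original project.
frame = cv2.imread("frame_0001.tif", cv2.IMREAD_UNCHANGED).astype(np.float32) / 65535.0
cv2.imwrite("frame_0001_tonemapped.png", tone_map(fake_bracket(frame)))
```

In a batch workflow the same operation would simply be looped over every frame of the sequence before reassembling the frames into a video.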

While HDRx is a powerful tool with a lot of benefits for shooting realistic-looking extended dynamic range, I steered clear of it this time to avoid the motion artifacts that come with it, especially since I imagine those slight artifacts would have been particularly problematic with a more "surreal" method of HDR tone mapping, as opposed to the more subdued and natural proprietary algorithm Red uses. Also, in this case, the goal was to show the added "pop" you get from HDR video when it's tone mapped with a Photomatix detail-compressing workflow, while trying to avoid going too far over the top and completely "cracking out" the image.

Please note that my method admittedly has several drawbacks. Namely, the grain from the pushed footage is a little excessive at times (a lot at others), and the push/pull limitations of the RAW file still won't let me capture the full dynamic range of an extreme lighting location like Times Square the way I can with DSLR bracketing across many more stops. Thus, billboards are still blown out in some shots -- just not as blown out as they would have been in traditional video footage. Unfortunately, in an attempt to mask some of the excessive noise, I also took some artistic liberties with noise reduction, so overall sharpness suffers a bit in several shots. There are also some flickering issues, some related to the high frame rates I shot certain scenes at, and others related to the HDR processing itself, since preventing the ugly halos associated with bad HDR is even trickier with moving footage. I did my best under the circumstances, but there are a few shots where halos rear their ugly heads.

To top it off, some of the high-frame-rate footage was shot at a higher compression ratio (as were a few normal 24fps shots where I goofed), and the tone-mapped image really brings out the compression artifacts there too. I tried to keep that footage to a minimum, but there were certain shots I liked compositionally that I chose to include anyway.

Nevertheless, the idea here is to show what can be captured with a workflow like this, as well as to hint at what might be possible once in-camera HDR technology improves to the point of capturing at least three exposures simultaneously, without the added detriment of having to push and pull in post, which, as stated before, adds quite a bit of grain.

Thanks, and I hope you enjoy...

Disclaimer: I've received many suggestions for better and/or more efficient ways to achieve a look very similar to the one I got here. For example, an HDR "effect" can be approximated pretty quickly on the fly with inverted channel overlays in color grading suites, and some of the other characteristics of HDR images -- the "pop" you get from enhancing microcontrast, for example -- can be more or less re-created with simple filters in other software. That doesn't extend the dynamic range in the same way, but in some cases the camera already has enough DR for the scene anyway, and even when the scene exceeds what the camera can capture, there are other ways to quickly recover parts of an image in color grading suites as long as you're working with RAW files.

I encourage others to try such methods (as well as mine, if you'd like) and come up with their own more efficient workflows; that's the whole point of tests like this. While I still think true local-operator tone-mapped HDR is the way to go for achieving the surreal HDR photo look in motion (what I was going for here), other options may well be faster and can no doubt approximate that look pretty closely. In some instances, as with HDRx, they may actually look more naturally pleasing to those who want to extend dynamic range while better replicating a traditional moving image. This workflow was ideal for me because I am experienced with Photomatix, but I am by no means pitching it as the definitive way to shoot HDR video. There are many other methods, including HDRx, which is a very viable option if you're going for a more realistic look, or even shooting with stereoscopic rigs to capture the individual exposures properly in camera.
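As for the filter-based alternative mentioned above, a large-radius, low-amount unsharp mask is one common way to approximate the microcontrast "pop" without tone mapping. This is only an illustrative sketch, not anything used for this video; the parameter values and filenames are made up.

```python
import cv2
import numpy as np

def local_contrast_boost(img_bgr, radius=50, amount=0.6):
    """Approximate the HDR 'pop' with a large-radius, low-amount unsharp mask."""
    img = img_bgr.astype(np.float32)
    low = cv2.GaussianBlur(img, (0, 0), sigmaX=radius)   # large-scale tonal structure
    boosted = img + amount * (img - low)                  # exaggerate the local detail
    return np.clip(boosted, 0, 255).astype(np.uint8)

# "frame_0001.tif" is again just a placeholder frame from a hypothetical image sequence.
frame = cv2.imread("frame_0001.tif")
cv2.imwrite("frame_0001_pop.png", local_contrast_boost(frame))
```

Unlike a true bracket merge, this only redistributes contrast that is already in the frame, which is why it doesn't extend dynamic range on its own.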
