Workflow Round-up January 2013

I’ve been trying out a lot of different workflows over the past year, depending on system specs, software versions, firmware versions, etc.

Some people probably still think I work with what I called the CineForm workflow. But to be honest, I haven’t used it since last summer.

When I started using a dII I was still on a 32-bit Windows system. After a couple of months of working with CinemaDNG footage I chose a Mac Mini with a Promise Pegasus 12 TB Thunderbolt RAID 5 hooked up to it to function as a media archive, hoping that Apple would soon release a new Mac Pro with Thunderbolt capabilities. I also anticipated a Windows driver for the Ikonoskop ExpressCard Reader.

When I started testing the Mac Mini, I was somewhat amazed by the power that such a silent little block of computing ingenuity could hold. After a couple of weeks I completely switched to the OSX platform, but was still using the CineForm workflow (via Parallels), because of its speed and raw format.

When summer arrived it was quite clear that we wouldn’t be getting a new Mac Pro, nor a Windows driver for the ExpressCard Reader. So I asked a hardware/software-savvy friend to build me an editing system with Thunderbolt support, running OSX.

Meanwhile, Adobe had released CS6 with SpeedGrade, and for the first time I was able to play some CinemaDNG footage in realtime on the Mac Mini, with insanely detailed control over grading. It made me forget about CineForm raw.

When the Hackintosh was ready in September, my workflow changed drastically for a second time.

Let’s look at a very personal assessment, on the two systems, of the three most-used decoder/transcoder tools for ‘developing’ and grading Ikonoskop footage:

I think it’s clear from the results above what, for me, at this time is the best tool for debayering, grading and transcoding my CinemaDNG footage. But it’s also clear that it depends on the system you’re using. So on my main editing system I’m now working in DaVinci Resolve 9 Lite (and that’s free, folks!). When going mobile, I’d rather use SpeedGrade.

I haven’t included CineForm in the test because I don’t use it anymore and I didn’t get it to work in Parallels 8/Windows 8. But I tested GoPro CineForm Studio Premium for Windows just before the summer and I was amazed by the new functionality that was integrated to debayer raw footage. Dynamic range was much improved and of course it was no longer a command line tool. But sadly enough, there’s no Mac version with the same raw functionality.

But then there’s John Hable at 19lights.com. I already wrote about his Ginger HDR in my last post, but in the meantime he has been working hard to improve his plug-in and I’m astonished by what he has accomplished in such a short time.

  1. You no longer have to create the .gnr wrappers, so via the plug-in you can natively edit CinemaDNG in Premiere Pro!
  2. Ginger HDR supports the (outlandish) audio implementation in Ikonoskop’s CinemaDNG. So you immediately have image AND sound of your CinemaDNG footage in Premiere Pro. This is huge!
  3. It supports timecode!

This is honestly spectacular. There’s still room for improvement (for instance better CUDA support) and John is working on this. But this is very promising.

Getting rid of Magenta in DaVinci Resolve

UPDATE: obsolete as of firmware 1.27 (March 2013)

There’s a lot of magenta in the shadows of Ikonoskop footage in DaVinci Resolve. I don’t think it has always been that way, but it definitely is the case in Resolve 9 Lite.

There are several ways to get rid of it. I offer one solution here that’s very quick and easy. In my opinion you have to fix it before you do any color correction in the Color tab. That’s why I’ve created an input LUT that fixes the magenta issue: IkonoVinci. Well, it will fix it until Blackmagic changes something on the decoder side and/or Ikonoskop changes something on the encoder side. Then I might have to make a new one ;-)

This means you don’t have to fiddle about in the CinemaDNG settings (Camera Raw tab) in Resolve. So leave the White Balance, Color Space and Gamma settings alone and use the Camera Metadata to do the decoding.

Unpack the zip and copy the folder into the LUT directory of DaVinci Resolve. On OSX this would be: Library > Application Support > Blackmagic Design > DaVinci Resolve > LUT.
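If you prefer doing this from a script instead of the Finder, here’s a minimal Python sketch. The destination is the OSX path mentioned above; the source path and folder name are just hypothetical examples, so adjust them to wherever your zip landed.

```python
# Minimal sketch: copy the unpacked IkonoVinci folder into Resolve's LUT directory on OSX.
# The source path is a hypothetical unpack location - change it to wherever your zip landed.
import shutil

src = "/Users/me/Downloads/IkonoVinci"  # hypothetical unpack location
dst = "/Library/Application Support/Blackmagic Design/DaVinci Resolve/LUT/IkonoVinci"

shutil.copytree(src, dst)
print("Copied LUT folder to", dst)
```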

Now go to your Project Settings. Select Look Up Tables and, under 3D Input Lookup Table, select the IkonoVinci.ilut.

Then apply the settings.

Now in your MEDIA tab you’ll still see the magenta. But when you go to the COLOR tab, you’ll see it’s gone. Now you can start grading.

A-cam dII & SpeedGrade - Part 1

OK, I haven’t forgotten about this blog!

But I should’ve written this post two months ago, because I’ve been grading Ikonoskop footage all summer in SpeedGrade and I must say, at this moment, it’s absolutely the right tool to grade A-cam dII CinemaDNG with.

Because:

  1. To load your footage just point at the folder of your project and select Each subfolder as one sequence (at the top). All your sequences will quickly appear with thumbnails.
  2. In your timeline tab (at the bottom) go to the tab “Color Space Defaults” and select CinemaDNG as your Default color space for file format. And choose the appropriate Preset: Ikonoskop A-Cam Daylight Fluorescent/Ikonoskop A-Cam Daylight/Ikonoskop A-Cam Tungsten

  3. Drag some sequences to your timeline and be amazed. Because it looks good, but most of all it plays realtime, even on something as light as a Mac mini and even after it’s been graded. Straight from the DNG’s.
  4. If you now want to render individual files (for example ProRes MOVs), make sure to select Src.PathElement1 under File name in the Output tab.
    Because this will create MOVs (or AVIs or whatever) named after your sequence folders (for instance AC003002.MOV), which will make linking to your original footage (reels) a breeze.
    If you want something fast to start rendering, select a Half Proxy and render in Offline Quality (fast). It’s really fast (faster than realtime).

To be continued …

Pomfort planning a CinemaDNG workflow

A while back I wrote to info@digitalbolex.com to ask them about their plans for their CinemaDNG workflow. 

Elle Schneider answered me last week and wrote:

Our current plan is to work with Pomfort, the makers of Silverstack (pomfort.com) to create a custom workflow manager to handle DNG files that could probably be used with the A-cam or any other DNG system.

Pomfort’s answer:

I can’t tell you any details right now, but we are indeed discussing several options for the release of the Digital Bolex.

So stay tuned!

OK, I’m staying tuned …

Batch linking CineForm raw proxies in After Effects

The fastest way to transcode CinemaDNG sequences to an intermediate is via CineForm’s DPX2CF command line tool.
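If you want to batch that over a whole card, something like this minimal Python sketch would do it. The paths are hypothetical and I’m deliberately not spelling out DPX2CF’s arguments here - check the tool’s own usage text and adapt the placeholder command.

```python
# Minimal sketch: run DPX2CF over every CinemaDNG sequence folder in a directory.
# Paths are hypothetical; the dpx2cf arguments are a placeholder - check the tool's usage text.
import os
import subprocess

source_root = r"D:\footage\AC005"   # hypothetical folder holding the sequence folders
output_root = r"D:\cineform"

for folder in sorted(os.listdir(source_root)):
    seq = os.path.join(source_root, folder)
    if not os.path.isdir(seq):
        continue
    out = os.path.join(output_root, folder + ".avi")
    cmd = ["dpx2cf", seq, out]      # placeholder invocation; adapt to the real syntax
    print("Transcoding", folder, "->", out)
    subprocess.run(cmd, check=True)
```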

But what if you don’t want to render your final print from the CineForm raw files? What if you don’t trust anything that has some form of compression?
And what if you want to use CineForm raw only as a proxy and render your final cut from the original CinemaDNGs, using the Adobe Dynamic Link workflow (AE/Pr)?

I’ve been looking at this in After Effects from all kinds of angles, but can’t seem to find a standard way of batch-linking a folder of proxies (not created in AE) to a bunch of original footage in the AE project. So I fabricated the following:

  1. Transcode your CinemaDNG sequences to CineForm raw AVI/MOV files and make sure to give them a name followed by a sequential number. In my example: sequence folder AC005001 becomes playtimefilms_test_1.mov (see the small sketch after this list).
  2. Open multiple CinemaDNG sequences in After Effects. I recommend the Immigration script for this.

     

    Just click Import and then hit return every time a sequence gets imported (Camera raw … sigh).
     
  3. Now you get a nicely organized directory structure of your footage. Select the sequences and drag them down to create compositions. 
     
  4. Select the compositions (named Comp, Comp 2, Comp 3, …) and open up the Selected_Comps_Changer script. In the script, enter “Comp ” in the Search for field (make sure to type a space after Comp) and enter the base name of your proxy files followed by an underscore (“playtimefilms_test_” in my example) in the replace with field. The script will change all the comp names to the names of your proxies, except for the first one (named just Comp). Change this one manually.

     
     
  5. Now select all the comps and use the hacked Create_Proxies script (you know, the script I first recommended and then told you to forget about … errrr … it’s back). But first hack it a bit further: I also deleted the part (lines 428-429) where the script actually creates proxies. I don’t want it to create proxies, I just want it to link the CF files to the compositions as proxies.
    In the script’s dialogue screen, select the folder where your CF proxies are stored as Destination and select an output module you’ve configured to render MOVs or AVIs (depending on which wrapper you’ve transcoded your CF files to).

     
     
  6. The script will then create a render queue (still further hackable) and seems to do nothing further. But click on a comp or sequence and you’ll see they now all have a linked proxy.
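To make the naming in steps 1 and 4 concrete, here’s a minimal Python sketch of the convention - my own illustration, not one of the AE scripts. The folder path and the playtimefilms_test base name are just the example values from above.

```python
# Minimal sketch of the naming convention from steps 1 and 4 (my own illustration, not an AE script).
# Each sequence folder gets a proxy named <basename>_<n>.mov in folder order, and the AE comps
# (Comp, Comp 2, Comp 3, ...) get renamed to the same base names so the proxies line up.
import os

source_root = "/Volumes/footage"   # hypothetical folder holding the CinemaDNG sequence folders
basename = "playtimefilms_test"    # base name used in the example above

sequences = sorted(
    d for d in os.listdir(source_root)
    if os.path.isdir(os.path.join(source_root, d))
)

for index, folder in enumerate(sequences, start=1):
    proxy = f"{basename}_{index}.mov"                 # e.g. AC005001 -> playtimefilms_test_1.mov
    comp = "Comp" if index == 1 else f"Comp {index}"  # AE's default comp names
    print(f"{folder}  ->  proxy {proxy}  (comp '{comp}' renamed to '{basename}_{index}')")
```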
     

It’s a really fast way of batch-linking a bunch of proxies that weren’t created in After Effects to your source footage. But does this make sense? I just can’t believe there isn’t a more standard way built into After Effects to do this kind of stuff. Is there?

Please enlighten me …

Post (traumatic) CinemaDNG workflows?

People are cursing over the “available” CinemaDNG workflows every day. At this time you can’t edit CinemaDNG sequences in any NLE, so you need to transcode to an intermediate or proxy first.

The tools I know of and/or have used to transcode CinemaDNGs to an editable intermediate are the following:

  • Adobe After Effects (5.5) - to any format you’ve got an encoder for
  • Blackmagic-Design DaVinci Resolve (lite) (8.2) - I did not get it to work - UPDATE:  Resolve 8.2.1 (beta 1) now does the trick
  • Iridas/Adobe Speedgrade - did not get hold of a demo version to test
  • Cineform’s dpx2cf command line tool - to CineForm raw
  • Adobe Lightroom (4.0) and a whole bunch of other raw stills transcoders - consequence of course is that you have to work with a stills sequence

I’ve been talking to a lot of people about their experiences and this is more or less the general feeling:
  • After Effects transcodes the CinemaDNGs too slowly
  • Resolve is very fast but does terrible things to colors
  • Speedgrade: very green and difficult to control grading (1 source)
  • CineForm: “yeah we know you’re getting good results, but a command line tool that only works under Windows … really?”
  • Lightroom: great control on grading but it’s a tool for stills/slow workflow/you have to do 1 shot at a time to keep some oversight

Yesterday I did a little test on a Mac Mini i7 with 8GB RAM. The test footage was 59.18 seconds (1,434 frames) long and about 4.5 GB in size.

  • After Effects: Render to Apple ProRes 422 (Proxy) took 25min 22sec. What? For less than 1 minute of footage? 
  • Lightroom: Export to JPEG took 7min 31sec. That’s somewhat better.
  • CineForm: Conversion to CineForm raw in 58sec! Now that’s fast!

And that’s why I use CineForm. Even if you don’t want to render your edit from the CineForm raw files but want to go back to the original CinemaDNGs, this is still the fastest workflow (using AE/Pr and Dynamic Link).
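To put those timings in perspective, here’s the same comparison as a quick back-of-the-envelope calculation (a minimal Python sketch using only the numbers quoted above):

```python
# Quick throughput comparison using the timings quoted above (1,434 frames, 59.18 s of footage).
clip_seconds = 59.18
frames = 1434

timings = {
    "After Effects (ProRes 422 Proxy)": 25 * 60 + 22,  # 1522 s
    "Lightroom (JPEG export)": 7 * 60 + 31,            # 451 s
    "CineForm (dpx2cf)": 58,                            # 58 s
}

for tool, seconds in timings.items():
    fps = frames / seconds
    speed = clip_seconds / seconds  # 1.0 = realtime
    print(f"{tool}: {fps:.1f} frames/s, {speed:.2f}x realtime")

# After Effects ends up around 0.04x realtime and Lightroom around 0.13x,
# while the CineForm conversion is roughly realtime (about 1.02x).
```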

Digital Bolex

What exciting news: a 2K CinemaDNG camera for $3,300 and with a roaring name - Bolex. Their Kickstarter project got funded in less than 24 hours.

Looking at their specs and pictures of the prototype I’m making some assumptions:

  1. Kodak sensor KAI-04050 (the big brother of the dII’s KAI-02150)?
  2. Camera: a tweaked Allied Prosilica GX2300?

A very clever project indeed. I’m not all that convinced by the nostalgic retro design, but that’s just me, and at least it’s some justification for calling it a Bolex.

I think this might be a very good thing for dII users as well. If this camera gets as popular as it seems to be getting, real adoption of CinemaDNG in post software might become a reality.

DANCINEC: de-Bayer software

Some more valuable information on RAW image processing can be found in the comments on Dan Hudgins’s videos on Vimeo. I recognized Dan’s name from some comments on the Ikonoskop forum from quite a while back. He has developed, and is still developing, his own de-Bayer and image processing software (white balance, grading, etc.).

I’ve been testing his software for a while now and still have a lot of testing to do. The learning curve is quite steep at first, but I’m getting there (if I find the time).
And it’s definitely worth the effort, because there’s something amazing about the result. But more about this later, as at this time I’m not sure I can trust my eyes.

Anyhow, let’s look at an excerpt of one of Dan’s comments:

As I mentioned in the comments for the #A1 lens test reel, you cannot look at the RAW footage without grading and see much, as it looks like this:

1) Out of focus because there has not been anti-OLPF compensation applied.

2) Very dark, almost black at higher ISO, because the camera records sensor-linear data: most of the information is in the lower percent of the signal. With two stops of headroom, 18% gray is at about 5% signal, depending on the black offset.

3) Almost no color at all, because the Bayer filter in Bayer sensors is desaturated to allow higher ISO and luma detail in all pixels; the color needs to be increased through a chroma matrix tailored to the sensor type, light source and subject matter.

4) The EI/ISO adjustment curve is absent, so the image brightness would vary from one shot to another. Part of the de-Bayer process is to softclip the highlight information to fit within the range of an 8-bit display. The camera shoots 12-bit linear data, which cannot be displayed on a computer monitor with the slope at the same angle as when shooting; to squeeze the tones into a viewable range, the RAW data needs to be run through a LUT that corrects for the EI/ISO used at the time the images were shot.

5) White balance: the RAW data has a green bias and no white balance. It’s true RAW data, so no matter what color light you shoot under, the sensor is recorded the same way and the data levels vary depending on the K value of the light.
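To put some rough numbers on points 2 and 4: here’s a minimal sketch of my own (not Dan’s code) showing why untouched linear data looks nearly black on an 8-bit display, and how even a crude display gamma - standing in for a proper EI/ISO LUT - lifts it back into a viewable range. The 5% figure comes from Dan’s comment above; the 1/2.2 curve is just an illustrative assumption.

```python
# Illustration of points 2 and 4 above (my own sketch, not Dan's pipeline).
# Assumption: 18% gray sits at roughly 5% of the linear signal, as stated above.

max_12bit = 4095    # 12-bit linear code values from the camera
max_8bit = 255      # 8-bit display code values

linear_gray = 0.05  # 18% gray at ~5% of full scale, linear

# Shown straight, with no curve applied: nearly black on an 8-bit monitor.
straight = round(linear_gray * max_8bit)                # ~13 / 255

# Run through a simple 1/2.2 display gamma (a crude stand-in for a proper EI/ISO LUT).
encoded = round((linear_gray ** (1 / 2.2)) * max_8bit)  # ~65 / 255, a viewable dark gray

print(f"12-bit linear code value: {round(linear_gray * max_12bit)} / {max_12bit}")
print(f"Displayed without a curve: {straight} / {max_8bit}")
print(f"Displayed through 1/2.2 gamma: {encoded} / {max_8bit}")
```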

Dan is using his software at the moment for tests on a KineRAW-S8 prototype. And you must admit, it looks amazing.

KineRAW-S8p (tm) 2.5K lens test reel B1 from Dan Hudgins on Vimeo.

Lack of Look

A while back I found this tweet:

I’m no longer in love with the Ikonoskop A-Cam DII. That look can be achieved with any camera by lifting the blacks and dropping saturation

At first I just found it a funny statement and thought something like “well, serves you right for falling in love with a tool”.

But then it started to annoy me, because it shows once more the degree of misunderstanding about a camera that shoots RAW. The point of a camera that shoots RAW is, in my opinion and I’m sure most will agree, to have NO look when the images leave the camera!
The point is that you can go practically anywhere in developing and grading to get just the look you want in post. So if lifting the blacks and dropping saturation is what you want, you’ve got it! Same with an oversaturated 50’s kind of look, or that washed-out style that seems so hip in fashion these days. The dII is not about that. It’s about getting the high-quality images that give depth to whatever look you want.
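For what it’s worth, “lifting the blacks and dropping saturation” is nothing more than this bit of math (a minimal sketch on a single normalized RGB pixel; the 0.1 lift and 0.6 saturation amounts are arbitrary):

```python
# Minimal sketch: the "look" from the tweet as plain math on one normalized RGB pixel.
# The lift and saturation amounts are arbitrary illustration values.

def lift_blacks(rgb, lift=0.1):
    # raise the black point: 0.0 becomes `lift`, 1.0 stays 1.0
    return tuple(lift + (1.0 - lift) * c for c in rgb)

def drop_saturation(rgb, saturation=0.6):
    # pull every channel towards the pixel's luma
    luma = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
    return tuple(luma + saturation * (c - luma) for c in rgb)

pixel = (0.05, 0.30, 0.60)                  # an arbitrary dark, bluish pixel
print(drop_saturation(lift_blacks(pixel)))  # the same pixel, milkier and flatter
```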

If you see any similarities in the footage that can be found today, it’s rather due to the software that applies the anti-OLPF compensation, does the de-Bayering, handles the black offset, etc.

Please visit this post (in Spanish) by Manuel López. He has made a great comparison of how the standard settings in different de-Bayer programs interpret dII footage, without any grading applied. Great stuff!

 

The still is a frame from Fabrizio Fracassi’s End Slate, a documentary-like short that you can download from Fabrizio’s site.