Size

What Resolution Is 35mm Film?

This is a somewhat controversial question, and there are many possible answers. Film is an analog medium, so it doesn't have pixels per se, though film scanners do have pixels and a specific resolution. The argument rages within the industry: how much is enough to capture everything held within a 35mm negative and work without compromise? The very fluffy and somewhat evasive answer is that there are in excess of 25 million pixels' worth of detail in a top quality 35mm image, which equates to around 6K, or 6144 x 4668 pixels. Most scanners will scan up to this resolution, and their manufacturers make a perfectly valid argument as to why you should work at such high resolutions, but the point is that the higher the resolution, the harder the material is to work with.

For example, if you are trying to view a 6K image on, at best, a 2K output, either only a portion of the 6K frame can be displayed or the image has to be resampled to show the entire frame, so two pixels out of every three in each direction may be dropped purely for viewing purposes - is this not defeating the object of such a huge scan?
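
To put some rough numbers on that (assuming a 6144 x 4668 scan viewed on a display 2048 pixels wide), a quick sketch of how much of the scanned data actually reaches the screen after a straight decimation:

```python
# Rough arithmetic only: how much of a 6K scan survives straight decimation to a 2K-wide display?
scan_w, scan_h = 6144, 4668          # assumed 6K scan dimensions
view_w = 2048                        # assumed display width

step = scan_w // view_w              # keep every 3rd pixel in each direction
kept = (scan_w // step) * (scan_h // step)
total = scan_w * scan_h

print(f"decimation step: every {step} pixels")                  # every 3 pixels
print(f"pixels shown: {kept:,} of {total:,}")                   # ~3.2 million of ~28.7 million
print(f"fraction of the scan on screen: {kept / total:.0%}")    # ~11%
```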

2K Standard

It has been generally accepted that DI work is carried out at 2K, or 2048 x 1556/1536, utilizing 10-bit Cineon log space. This is accepted as having enough dynamic range and resolution to produce a very good working print for cinema release. Stepping away from the creative and quality argument for a moment - apologies, but in essence size costs! As the resolution increases, so do storage, bandwidth and ultimately the time spent on each frame: longer to scan, longer to dust bust, more disk space needed, faster machines required - the timescales are exponential.

A 2K Cineon/DPX frame at 10-bit log requires about 12 MB of data per single frame. A 4K frame requires around 48 MB, four times the storage for a single frame alone. The point is that 4K is only just starting to be handled effectively in real time within DI, and the cost is still prohibitive because of the storage space required, even though the argument is that even this is getting cheaper.
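
As a sanity check on those figures, a back-of-envelope sketch (assuming the usual DPX/Cineon packing of three 10-bit channels into one 32-bit word per pixel, plus a nominal 8 KB header):

```python
def dpx_frame_mb(width, height, bytes_per_pixel=4, header_bytes=8192):
    """Approximate size of a 10-bit log DPX/Cineon frame in megabytes.
    Three 10-bit channels are packed into one 32-bit word per pixel."""
    return (width * height * bytes_per_pixel + header_bytes) / 2**20

print(f"2K (2048 x 1556): {dpx_frame_mb(2048, 1556):.1f} MB")   # ~12.2 MB
print(f"4K (4096 x 3112): {dpx_frame_mb(4096, 3112):.1f} MB")   # ~48.6 MB
```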

Storage - 2K

To store the basic scans for a 2K Digital Intermediate of a 90 minute feature:

  • 1 second = 24 frames
  • 1 minute: 24 frames x 60 = 1,440 frames per minute
  • 90 minutes: 1,440 x 90 = 129,600 frames per feature film
  • 12 MB x 129,600 = 1,555,200 MB, or over 1.5 TB

Therefore, to hold the full-length scans of a theorised 90 minute feature, you need in excess of 1.5 terabytes of disk space - and that is before anything has been applied to the material or rendered. For example, if you dust bust and clean every frame, keeping the originals for safety requires another 1.5 TB of space, and when you finally grade the material and output a rendered version with all adjustments applied, to be recorded back to negative, that requires a further 1.5 TB.

So for one theoretical project alone you will need in excess of 4.5 TB of space if you use 2K as the standard scan size. How many other projects will be working in house at the same time? To be cost effective, the facility will need to be scanning the next project while the last film is being output to film and removed from the system. So in a basic theorised workflow you would need, at minimum, enough disk storage to hold three films, or approximately 15 terabytes.
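
The same back-of-envelope sums in a small sketch, parameterised so the frame size and number of versions can be changed (12 MB per frame, three versions and three concurrent films are the assumptions used above):

```python
def feature_storage_tb(frame_mb, minutes=90, fps=24, versions=1):
    """Back-of-envelope storage for one feature: frame size x frame count x versions."""
    frames = fps * 60 * minutes                      # 129,600 frames for 90 minutes
    return frame_mb * frames * versions / 1_000_000  # MB -> TB

per_version = feature_storage_tb(frame_mb=12)                # raw scans only
per_project = feature_storage_tb(frame_mb=12, versions=3)    # scans + cleaned + graded
facility    = per_project * 3                                # three films on the system at once

print(f"2K, one version : {per_version:.2f} TB")   # ~1.56 TB
print(f"2K, one project : {per_project:.2f} TB")   # ~4.67 TB
print(f"2K, three films : {facility:.1f} TB")      # ~14 TB - roughly the 15 TB quoted above
```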

Storage - 4K

  • 48 MB x 129,600 = 6,220,800 MB, or over 6.2 TB

If the same project were carried out at 4K, we would need in excess of 18 TB of space for just this one project. If we had multiple projects on the system we would need around 60 terabytes of available space, and managing the data becomes mind-bogglingly complex - ever wondered where the term 'data wrangler' in the credits of recent feature films came from?
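
Running the storage sketch above with frame_mb=48 reproduces these figures: roughly 6.2 TB for the raw scans alone, around 18.7 TB for one project with three versions, and in the region of 56-60 TB once three 4K projects sit on the system at the same time.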

 

High Definition

If we examine just the image size of the frame, High Definition is very close to the resolution of a 1.85:1 extraction once the aspect ratio is taken into consideration, as you can see from the following representations/comparisons.

Many newer feature films are being shot on HD, and after post production the High Definition material is blown up to fit standard projection aspect ratios. This is either achieved during the post stages or by using the Arri laser recorder, which records the HD frame directly at its intended output size. The latter technique is possibly better, as the post house saves valuable disk space: an uncompressed HD frame is only around 9 MB.
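
As a rough illustration of both points (assuming 1920 x 1080 for HD, the 1998 x 1080 DCI 'flat' container as the 1.85:1 comparison, and the same 10-bit DPX packing as earlier):

```python
hd_w, hd_h     = 1920, 1080   # High Definition, 1.78:1
flat_w, flat_h = 1998, 1080   # 2K 'flat' 1.85:1 container (DCI)

print(f"HD holds {hd_w * hd_h / (flat_w * flat_h):.0%} of the pixels of a 2K 1.85:1 frame")  # ~96%
print(f"uncompressed HD frame at 10-bit: {hd_w * hd_h * 4 / 2**20:.1f} MB")                  # ~7.9 MB, in the ballpark of the ~9 MB above
```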

It is undoubtedly a cheaper medium and has fewer restrictions: tapes can hold a lot more footage than a camera negative, extortionate lab costs are avoided, and ultimately you know you have captured exactly what you need because the tapes are reviewable straight away - HD certainly is very attractive. If it is shot with a film grade in mind it can look fantastic, and recent films have certainly proved that HD is a growing medium.

The main problem ultimately to bear in mind is that High Definition is a linear format and film is logarithmic - record the HD data directly to film and it will look wrong, as you can see by simply viewing HD data with a film LUT applied. Therefore a grade or a LUT must be burnt into the data for output to film. There are standard High Definition linear-to-logarithmic cubes that can be applied to HD data for direct output to film; this, combined with the blow up from source, would be a cheap(er) solution for output to film.
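
The conversion itself is just a curve. Below is a minimal sketch of one commonly quoted form of the Cineon linear-to-log mapping (reference black at code 95, reference white at code 685); a real pipeline would use a properly built cube or a colour management system rather than this toy function:

```python
import numpy as np

BLACK_CODE, WHITE_CODE = 95, 685                     # Cineon reference black / white
OFFSET = 10 ** ((BLACK_CODE - WHITE_CODE) / 300.0)   # ~0.0108, lifts linear black off zero

def lin_to_cineon(lin):
    """Map scene-linear values (1.0 = reference white) to 10-bit Cineon log codes."""
    lin = np.clip(np.asarray(lin, dtype=float), 0.0, None)
    return 685.0 + 300.0 * np.log10(lin * (1.0 - OFFSET) + OFFSET)

print(lin_to_cineon([0.0, 0.18, 1.0]))   # approximately [95, 468, 685]
```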

Rick McCallum, a producer on Attack of the Clones, has commented that the production spent $16,000 on 220 hours of digital tape, where a comparable amount of film would have cost approximately $1.8 million! With newer disk-based systems such as the Red One or the SI 2K, the cost is even lower, and exact backups can be stored at different locations on different media without generational loss. However, this does not necessarily indicate the actual percentage saving, as the very low incremental cost of shooting additional footage may encourage filmmakers to use far higher shooting ratios with digital. Lower shooting ratios may save time in the editing suite, with less material to view, but with the extra footage and flexibility a production may capture monumental or pivotal shots it would otherwise have missed.

Interestingly, at the 2009 Academy Awards, best cinematography was awarded to a film shot mainly digitally, Slumdog Millionaire. Another nominee, The Curious Case of Benjamin Button, was also shot digitally. James Cameron's Avatar (now the highest grossing movie ever) was shot for 3D utilising the Fusion Camera System, pioneered by PACE, which uses two Sony HD cameras for the left and right eyes.

 

Super 2K

Something I have heard bandied around for a few years now is 'Super 2K', and inherently, because of the name and the cost, it must be better than 2K? Technically, yes, I guess it is slightly better than standard 2K, but it is still 2K. The point is that Super 2K tends to be an over scan, at say 4K or 6K, with clever algorithms then scaling the image back down to 2K - and the algorithms of the companies offering 'Super 2K' have as much myth and legend about them as the secret ingredients in the spicy coating of a piece of KFC chicken!

I would say that a lot of this sampling is Nyquist-based. If you really would like to understand the mathematics behind Nyquist sampling then by all means take a look - good luck. If not, think of it as a very clever way of filtering and combining excess data, such as reducing three scanned pixels to one, without introducing aliasing artefacts.
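
Nobody outside those companies knows their exact filters, but a minimal, hypothetical sketch of the general idea - averaging each 3 x 3 block of an over scan before throwing pixels away, so the result is band-limited rather than simply every third pixel - might look like this:

```python
import numpy as np

def downsample_3to1(scan):
    """Toy 'over scan then filter down' step: average each 3 x 3 block into one pixel.
    Averaging before decimation band-limits the image (the Nyquist idea); real systems
    use far more sophisticated filters than a plain box average."""
    h, w = scan.shape
    h, w = h - h % 3, w - w % 3                      # trim so both dimensions divide by 3
    blocks = scan[:h, :w].reshape(h // 3, 3, w // 3, 3)
    return blocks.mean(axis=(1, 3))

six_k = np.random.rand(4668, 6144)                   # stand-in for one channel of a 6K scan
two_k = downsample_3to1(six_k)
print(two_k.shape)                                   # (1556, 2048)
```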

Scanning, in my humble opinion, has come a long way and can offer some pretty incredible images straight from the initial scan. For instance, the Arriscan exposes the image twice - what Arri terms flashing - with one flash exposed for shadow detail and another for highlight detail. The two images are then combined to form a higher dynamic range image than a single flash, or single scan pass, alone could give. I am very much in favour of this scanning technique.
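
Arri's actual processing is their own, but the principle of merging two exposures is easy to sketch. A toy illustration (hypothetical function and weighting, not Arri's algorithm): each pass is trusted where it is well exposed and weighted down where it approaches clipping, and the two are blended in linear light:

```python
import numpy as np

def merge_two_flash(shadow_pass, highlight_pass, ratio=4.0):
    """Toy two-exposure merge, NOT Arri's actual processing.
    shadow_pass:    longer/brighter exposure, 0..1 (clean shadows, clipped highlights)
    highlight_pass: shorter exposure, 0..1 (clean highlights)
    ratio:          assumed exposure gap between the two flashes"""
    lin_shadow = shadow_pass / ratio       # bring both passes onto the same linear scale
    lin_high = highlight_pass
    # trust the shadow pass fully below ~80% of clip, not at all above ~95%
    w = np.clip((0.95 - shadow_pass) / 0.15, 0.0, 1.0)
    return w * lin_shadow + (1.0 - w) * lin_high
```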

 

4K Comparison

Obviously 4K is being championed now because many systems are able to play the material back in real time, and digital cameras such as the Red One shoot at 4K. They still carry the same dynamic range limitations associated with digital capture, but careful capture can lend itself very well to the workflow. Although, as I have said, it still remains a rather slow and costly workflow; many companies simply cannot afford to offer the service, and clients are not willing to pay a rate while machines are being used simply to move huge masses of data around a system - remember you are paying for the hire of not only the system but an operator as well.
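
'Real time' here is really a bandwidth requirement. A rough figure, using the frame sizes from earlier:

```python
frame_mb = {"2K": 12, "4K": 48}      # approximate 10-bit DPX frame sizes used earlier
for fmt, mb in frame_mb.items():
    print(f"{fmt}: {mb * 24} MB/s sustained for 24 fps playback")
# 2K: 288 MB/s, 4K: 1152 MB/s - over a gigabyte per second for a single 4K stream
```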

 

Size and data are generally difficult to comprehend in the abstract; hopefully the diagram above demonstrates the difference in sizing.

 

Film is still the favoured option, and scanning at a higher resolution will give far better clarity and definition in the image. I am sure that if we never saw the 4K version we would never even worry about 2K and would consider it 'good enough'. I am positive that when 4K becomes the standard format and even home PCs are powerful enough, I will be editing this section questioning why we bothered with 2K and asking when 8K will become the new '4K' - watch this space. For me, bigger is not always necessarily better: if the image has blown highlights, it is difficult to grade and beautify, whereas with film at least I know there will always be some headroom.

 

Arriscan - 2K 'vs' 4K Comparison.