Trying to compare HD (digital) and film is actually quite difficult, as in essence they are two different mediums. That said, I have sat and listened to many people quote the theoretical resolution of film as 4K, 6K or even 8K, obviously depending on who you discuss (or argue) it with. While trying to find a way of discussing resolution, I found an interesting comparison between film and HD; hopefully I have condensed the information into something that may help you think about resolution in a different way - as it did for me.
Resolution is the visible detail in an image. Since pixels are the smallest units of information in the digital world, comparing pixel counts would seem a good way to compare relative resolution, and indeed it is usually touted as the way to do it.
Film is analog, so there are no real pixels. However, based on converted measures (through scanning), a 35mm frame yields 3 to 12 million pixels, depending on the stock, lens, and shooting conditions. An HD frame has about 2 million pixels (1920 x 1080). On this basis, even at the lowest figure, 35mm appears vastly superior to HD. This is the argument most film purists use. The truth is, pixels are not the way to compare resolution. The human eye cannot see individual pixels beyond a short distance. What we can see are lines.
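The arithmetic behind these figures is easy to check. Here is a small Python sketch; the 2K and 4K film-scan dimensions below are common full-aperture scanning resolutions, used purely as illustrative assumptions, not figures from the text:

```python
def megapixels(width, height):
    """Total pixels in a frame, expressed in millions."""
    return width * height / 1_000_000

# An HD frame: 1920 x 1080.
print(f"HD frame:     {megapixels(1920, 1080):.1f} MP")   # ~2.1 MP

# A 35mm frame scanned at typical 2K and 4K full-aperture sizes
# (assumed dimensions, for illustration only).
print(f"2K film scan: {megapixels(2048, 1556):.1f} MP")   # ~3.2 MP
print(f"4K film scan: {megapixels(4096, 3112):.1f} MP")   # ~12.7 MP
```

The scan figures land at roughly 3 and 12 million pixels, which is where the "3 to 12 million" range quoted above comes from.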
Consequently, manufacturers measure the sharpness of photographic images and components using a parameter called Modulation Transfer Function (MTF). This process uses lines (not pixels) as a basis for comparison. Since MTF is an industry standard, we will maintain this standard for comparing HD with 35mm film. In other words, we will make the comparison using lines rather than pixels. Scan lines are the way video images are compared, so it makes sense from this viewpoint, as well.
Standard definition and high definition refer to the number of scan lines in the video image. Standard definition is 525 lines for NTSC and 625 lines for PAL. Technically, anything that breaks the PAL barrier of 625 lines could be called high definition. The most common HD formats are 720p and 1080i.
There is an international study on this issue, called Image Resolution of 35mm Film in Theatrical Presentation. It was conducted by Hank Mahler (CBS, United States), Vittorio Baroncini (Fondazione Ugo Bordoni, Italy), and Mattieu Sintas (CST, France).
In the study, MTF measurements were used to determine the typical resolution of theatrical release prints and answer prints in normal operation, utilizing existing state-of-the-art 35mm film, processing, printing, and projection.
The prints were projected in six movie theaters in various countries, and a panel of experts assessed the projected images using a well-defined formula. The results are as follows:
As the study indicates, perceived differences between HD and 35mm film are quickly disappearing. Notice that the word perceived is used. This is important because films are not being made for anyone other than audiences, which we sometimes forget while trying to get the technicalities sorted. It is useful to note that, at the end of the day, the theatre average assessment is not greatly higher than the standard definition we presently watch at home in the UK - shocked, I was!
At this point, the typical audience cannot see the difference between HD and 35mm. Even hardened professionals have a hard time telling them apart, to be fair.
I remember working on some standard definition material when an American DOP breezed into the grading suite, peered inquisitively at my monitor and said, "Wow, this high definition material is just looking great these days, isn't it!" The producer and I just stared blankly at each other, and when he left we promptly discussed how terrible the NTSC images must be!
This study was based on standard HD with 1080 horizontal scan lines. We now have Ultra HD with 2,160 lines, and 8K formats with 4,320. Based on this, I'd personally say the debate is moot. 16mm, 35mm, DV, HD and digital acquisition (D20, Varicam, Viper or Red) are all simply extra tools available to the filmmaker. The question is not which format is best, but rather, which format is best for your project? The answer, of course, is always based on a balance between aesthetic and budgetary considerations.
A Note On Mega Pixels - Uncovered
I read an interesting article online called 'The Megapixel Myth' by Ken Rockwell, which might be found useful when talking about resolution; hopefully I have summarised it enough to make sense. If not, seek out the original article for a more concise and eloquent explanation.
The megapixel myth was started by camera makers, who use the number of megapixels a camera has to fool you into thinking it has something to do with image quality. They use it because even a tiny increase in linear resolution results in a large increase in total pixel count. Therefore, camera makers can always brag about how much bigger and better this week's camera is, even with negligible improvements.
This gimmick is used by sales people and manufacturers to make you feel as if your current camera is inadequate and needs to be replaced even if the new cameras each year are only slightly better.
One needs at least a doubling of linear resolution or film size to make an obvious improvement. This is the same as quadrupling the megapixels - think of 2K versus 4K for comparison. A simple doubling of megapixels, even if all else remained the same, is very subtle. The factors that truly matter, like color and sharpening algorithms, are far more relevant to how an image looks - the number of megapixels actually has very little to do with how the image looks. Even worse, plenty of lower MP cameras can make better images than poorer cameras with more MP.
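The doubling relationship is simple to verify with a little arithmetic. A quick Python sketch, using 2K and 4K frame sizes purely as illustrative assumptions:

```python
def pixels(width, height):
    """Total pixel count of a frame."""
    return width * height

# Doubling linear resolution quadruples the pixel count: 2K vs 4K.
ratio = pixels(4096, 2160) / pixels(2048, 1080)
print(f"4K has {ratio:.0f}x the pixels of 2K")   # 4x

# Conversely, a headline-grabbing megapixel jump is a small linear gain:
# going from 10 MP to 12 MP is only about a 10% increase in linear
# resolution, since linear resolution scales with the square root.
linear_gain = (12 / 10) ** 0.5 - 1
print(f"10 MP -> 12 MP is only {linear_gain * 100:.0f}% more linear resolution")
```

This is why a camera advertised with "20% more megapixels" looks essentially identical in practice: the visible (linear) detail has barely changed.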