A lot has been written about pixels, and more is sure to follow. As a Sigma camera user I am aware of the confusion and marketing hype around this term. Does my DP1 have 4.7 megapixels, or 14? Does my 10-megapixel Nikon have 10, half of that, a quarter of that, or in reality no pixels at all? And the question many have asked: just how many pixels are on film, anyway?
I guess it depends upon where one starts and how one chooses to count. The funniest thing to me about this is that if the images aren't printed, I can't see where it matters at all. Looking ultra-close at an electronic image, called pixel peeping by many, has no significance other than that assigned by the peepers themselves.
Now when an image is printed, the fullness of the "Negative" comes into play. There is a quality which might be called sharpness, clarity, impact, whatever. The more pixels that go into the process, the better the result. Sharpness in lenses is typically measured in line pairs per mm. The principle is easy enough to understand: the better the lens, the more it can resolve. So how do pixels and line pairs compare, and how many does it take before we can't perceive a difference? I had been under the impression that 5 line pairs per mm makes a sharp print. Ctein, in an essay on T.O.P. today, argues that the human eye can detect differences in prints until they exceed 30 line pairs per mm. WHEW! That's a bunch. He further argues that today's best printers are capable of delivering half of that, or 15 lp/mm. 15 lp/mm equates to about 100 megapixels for an 8x10 print. I guess that would be about 40 megapixels for a Foveon sensor. Still a ton. Kinda puts the argument about 4.7 vs. 14 into perspective.
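Out of curiosity, that lp/mm-to-megapixel conversion can be sanity-checked with a few lines of Python. The pixels-per-line-pair factor here is my own assumption: the Nyquist minimum is 2 pixels per line pair, but a value nearer 3 is what lands on the ~100 MP figure.

```python
# Back-of-the-envelope check of the 15 lp/mm figure for an 8x10 print.
# ASSUMPTION: ~3 pixels per line pair (Nyquist minimum is 2; real-world
# printing usually needs somewhat more), chosen because it reproduces
# the ~100 MP figure quoted above.
MM_PER_INCH = 25.4

def print_megapixels(width_in, height_in, lp_per_mm, px_per_lp=3):
    """Megapixels needed to render lp_per_mm on a print of the given size."""
    px_per_mm = lp_per_mm * px_per_lp
    w_px = width_in * MM_PER_INCH * px_per_mm
    h_px = height_in * MM_PER_INCH * px_per_mm
    return w_px * h_px / 1e6

print(f"{print_megapixels(8, 10, 15):.0f} MP")  # roughly 100 MP
```

At the strict Nyquist factor of 2 the same print works out to about 46 MP, so the exact number depends heavily on that one assumption.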
Now the question about film. Some very smart folks have written about the theoretical limits of film. Good essays. Wrong, of course, but well written anyway. The limiting factor for film, assuming a perfect exposure with a world-class lens (ha), is the scanner. Scanners seem to come in two types: drum and CCD. Drum scanners cost a year's pay and take a year or so to really learn to use. All others, in my opinion, are consumer-grade CCD scanners. Some are quite expensive, some quite reasonable; all use the same technology and produce acceptable scans, but not drum-scan quality. My scanner is an Epson V700, which is actually a quite good CCD scanner, flatbed type, reasonably priced. Although marketing-rated at up to 6400 dpi, I find that I seldom scan at that level. It just makes the files larger and doesn't seem to add much.
I have a couple of cameras with outstanding lenses, and if I use a fine-grained film I can actually see an improvement in scan quality all the way up to 6400 dpi. The image with this post was taken with a Contax TVSiii, an excellent film "point and shoot" with a world-class Zeiss T* lens. I usually have this camera loaded with B&W film, in this instance Fuji Acros, a fine-grained 100 ASA film. Interestingly, I dropped the camera just before entering the restaurant and the image isn't in perfect focus...not that it seems to matter much in this instance.
Details: I test-scanned this image at increasing resolutions and indeed saw a difference all the way up to 6400 dpi. About 30% of this image was cropped out. The pixel size of this image, 70% of the negative, is 6012 x 6967 pixels. When I do the math, that's 41.9 megapixels. Projecting this out indicates that a full-size 35mm negative of this quality would scan to about 60 megapixels...more or less. Further, as scanners get better, perhaps there are more pixels to squeeze from this image. Who knows. Film still has things to teach us.
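The arithmetic above is easy to verify. The pixel dimensions and the 70%-of-negative figure come straight from the scan; the rest is simple division:

```python
# Checking the scan math: megapixels in the cropped scan, then a
# projection to the full 35mm negative (the crop is ~70% of the frame).
crop_mp = 6012 * 6967 / 1e6          # cropped scan, in megapixels
full_frame_mp = crop_mp / 0.70       # scale up from 70% of the negative

print(f"{crop_mp:.1f} MP in the crop")        # ~41.9 MP
print(f"~{full_frame_mp:.0f} MP full frame")  # ~60 MP
```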
At any rate, on a cold, rainy Halloween morning, to have this pretty young woman bring me a cup of hot coffee and a smile was wonderful. She was quite pleased with the print I made her. Prints. Good for photographers.