July 22nd 04, 11:42 PM
Gordon Moat
Digital quality (vs 35mm): Any real answers?

Toralf wrote:

Hi.

I'm still wondering about how good the image quality of modern digital
cameras (especially SLRs) really is, in particular how it compares with
35mm film. I've seen many articles on the subject on the Net, but few of
them seem to give you a lot of tangible information (I want to see the
numbers, please), and I can't help feeling that tests they refer to are
usually done to prove a point, i.e. that digital cameras are as good as
35mm, which is not the way you do proper research.


The reality is that both film and direct digital offer some good choices for
producing images. I see them more as complementary devices rather than an
either/or choice.



To say a few words about myself, I'm working for a company that makes
high-accuracy, large-format scanners, so I'm not particularly impressed
when I hear e.g. 6 million pixels (you need to talk about *billions* of
pixels if I'm really going to listen), and the word "interpolation"
leaves a bad taste in my mouth. But this also means I know that high
resolution isn't everything, of course; parameters like geometric
precision or signal-to-noise ratio also count a lot.


Do you work for Creo?



Be that as it may, some of the questions I'd like to have answered are
these:

1. What is the resolution of a 35mm film anyway? I think I read
somewhere that a colour negative is at least 3000dpi. Is that correct?
How about black&white? (Yeah I know, a film doesn't have pixels in
exactly the same sense as a digital image, but it *is* made up of
discrete elements after all.)


Okay, it is tough to find numbers that tell an absolute. If you check the film
data sheets (often in PDF format) from AGFA, Fuji, Ilford, and Kodak, you will
find many films capable of 100 lp/mm resolution or more, though those figures
come from photographing test targets under controlled conditions. Similar test
target photos posted to DPReview seem to indicate just under 50 lp/mm for the
near 35 mm sized imaging chips (such as the Canon 1Ds and Kodak 14n, also 14c).
Some smaller digital chips seem to resolve more detail than that, though those
are often found in P&S cameras, so I will leave them out of the discussion.
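
To put those lp/mm figures into pixel terms, here is a rough back-of-the-envelope
sketch (my own arithmetic, not from any data sheet), using the common rule of
thumb that one line pair needs at least two pixels:

# Rough conversion between lp/mm and pixel counts for a 24 x 36 mm frame.
# Assumes the Nyquist rule of thumb of two pixels per line pair; real-world
# figures come out lower than this ceiling.

FRAME_W_MM, FRAME_H_MM = 36.0, 24.0

def megapixels_for_lp_per_mm(lp_mm):
    """Pixel count implied by a given lp/mm figure over a full 35 mm frame."""
    px_per_mm = 2.0 * lp_mm               # Nyquist: 2 pixels per line pair
    w_px = FRAME_W_MM * px_per_mm
    h_px = FRAME_H_MM * px_per_mm
    return w_px, h_px, (w_px * h_px) / 1e6

for lp in (50, 100):
    w, h, mp = megapixels_for_lp_per_mm(lp)
    print(f"{lp} lp/mm -> {w:.0f} x {h:.0f} px, about {mp:.1f} MP")

# 50 lp/mm  -> 3600 x 2400 px, about 8.6 MP
# 100 lp/mm -> 7200 x 4800 px, about 34.6 MP

So by that rule of thumb, roughly 9 MP corresponds to the test target performance
of the near 35 mm chips, while matching the film data sheet figure would take
something over 30 MP.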

Obviously, many of us choose 35 mm photography so we can do hand-held shooting.
If you place the camera on a tripod, you increase the captured resolution. If
you use strobes, flash, or other controlled lighting, you can also increase the
resolution. Again, not many of us do that all the time, which means that choosing
to shoot hand-held only reduces the maximum possible resolution. The same choices
of use affect the resolution of direct digital SLR imagery as well.

If you went only by the test target lp/mm comparison, it would seem that film
is capable of twice as much resolution as direct digital imaging, or more.
However, resolution at the edges of a film image always seems to suffer a bit,
while the results from some direct digital cameras seem to indicate that the
falloff at the edges is absent, or at least less severe. That would imply that
direct digital imaging with some devices might give a more even resolution
across an image.

I also think that resolution is a very unfortunate choice of comparison. If you
look at the colour response, you will find that film is very different. Any
digital imaging sensor using an RGB filter (Bayer pattern or Foveon) will have
limited response accuracy in colours approaching Cyan or Yellow. There is also
the problem that white balance seems to vary in operation when comparing
different systems. You should also read more about interpolation in imaging
chip image processing. One resource with some great technical papers about
this is:

http://www.fillfactory.com



2. What about the print? 300dpi?


Any print can either make an image or break an image. There are so many
different printing technologies that it is tough to put one number on any of
them. If you have ever heard of an imagesetter, these devices output to film at
2400 or 2540 dpi, using laser technology. A different factor is dot gain, which
is the spreading of ink. All inks spread a little, though desktop inkjet systems
have the highest dot gain.
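
To give some idea of what a figure like 300 dpi means in terms of file size,
here is a quick sketch (my own arithmetic, not tied to any particular printer
or paper):

# Sketch of how pixel dimensions relate to print size at a target output
# resolution in pixels per inch.

def print_size_inches(width_px, height_px, ppi=300):
    """Largest print, in inches, that keeps at least `ppi` pixels per inch."""
    return width_px / ppi, height_px / ppi

# Example: a 6 MP file of roughly 3000 x 2000 pixels
w_in, h_in = print_size_inches(3000, 2000, ppi=300)
print(f"At 300 ppi: {w_in:.1f} x {h_in:.1f} inches")   # 10.0 x 6.7 inches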

If you are imaging to a desktop inkjet output, then many direct digital choices
might be an easier path. It takes a great deal of knowledge to get the best
results in printing, or at least lots of trial and error.



3. I know that the most common sensors are made up of individual
elements for the red, green and blue channels, arranged in a special
pattern, whose data is somehow interpolated into RGB pixels. But what
exactly does e.g. 6 megapixels mean in that context? Does it mean that
the sensor has (just) 6 million elements, or that data from a higher
number (like 18 or 24 million) is combined into 6 million RGB pixels?


A Bayer pattern is an array of Red, Green, and Blue filtered pixels in a
repeating pattern. Most of these Bayer pattern filters are arranged so that
there are twice as many Green pixels as Red or Blue. The Kodak patent actually
dates from the mid-1970s, and the choice of Green dominance was made then based
on the human eye being able to resolve Green slightly better than Blue or Red.
So to answer the question directly: a 6 megapixel Bayer sensor has 6 million
photosites in total, each recording a single colour, and the full RGB value of
each output pixel is interpolated from neighbouring photosites. Again, check out
the Fill Factory web site link for more technical information.
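
As a small illustration, here is a sketch (my own, not from the Fill Factory
papers) of the classic RGGB Bayer tile and what a "6 megapixel" figure means
for such a sensor:

# One 2 x 2 tile of the repeating Bayer mosaic: half the sites are Green.
BAYER_TILE = [["R", "G"],
              ["G", "B"]]

def bayer_colour(row, col):
    """Colour filter covering the photosite at (row, col)."""
    return BAYER_TILE[row % 2][col % 2]

# A hypothetical "6 MP" Bayer sensor: 3000 x 2000 photosites in total,
# not 3 x 6 million. Each photosite measures only one channel, and the
# missing two channels at each pixel are interpolated from neighbours.
width, height = 3000, 2000
greens = sum(bayer_colour(r, c) == "G" for r in range(2) for c in range(2))
print(f"{width * height / 1e6:.0f} million photosites, {greens} of every 4 are Green")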

Sony, with its latest fourth-colour filter pattern, is trying out a different
approach. Variations like this have occurred in the past, and the previous
results often showed interpolation errors, especially at edges of objects in a
scene. While the Sony attempt is a nice direction, it would seem to need more
work. Read more by searching for "fringing" in Sony based images.



The same question more bluntly put: When Canon/Nikon/Pentax is talking
about 6MP, is that just as big a lie as the one about 10MP on Sigma
cameras? (I'm hoping not, as I think the Sigma/Foveon way of counting
really takes the cake.)


Technically, photographing a mostly Green subject or scene should show more
detail than photographing something more Red or more Blue. However, if you look
at the largest file size and the dimensions of the chip in millimetres, you can
figure out the maximum potential resolution. Doing just that would seem to
indicate that some direct digital SLRs are capable of 50 to 58 lp/mm, though in
practice they fall short of that. There is also an anti-alias (anti-moiré)
filter, often an IR filter, a microlens array (or similar diffraction layer),
and the Bayer pattern, all of which reduce some of the maximum potential
resolution.
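
Here is that arithmetic spelled out as a short sketch (the pixel count and chip
width below are my own approximate figures for a near full-frame DSLR, so treat
them as illustrative):

# Divide the pixel count by the chip width in mm, then halve it (two pixels
# per line pair) to get a theoretical ceiling in lp/mm. Real results come in
# lower because of the anti-alias filter, Bayer interpolation, and so on.

def max_lp_per_mm(width_px, chip_width_mm):
    """Theoretical upper bound in lp/mm for a given pixel pitch."""
    px_per_mm = width_px / chip_width_mm
    return px_per_mm / 2.0

# Example: roughly 4064 pixels across a chip about 35.8 mm wide
print(f"{max_lp_per_mm(4064, 35.8):.0f} lp/mm ceiling")   # about 57 lp/mm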

With the Foveon, the potential is that Green, Red, or mostly Blue scenes could
all be recorded at equally high resolution. Foveon also interpolates the data
striking the imaging chip, though it provides the potential to do better. This
is really an attempt to get away from the Bayer pattern, though they do not have
it quite figured out yet. Perhaps a future release will provide better results.
Take a look at the file size and chip dimensions, and you know the current
limits.



4. Can the inaccuracy associated with the above mentioned interpolation
be quantified and/or measured against e.g. the error introduced by
scanning a negative with a film-scanner? And how does it compare with
pixel interpolation in the scanning sense?


A CCD film scanner uses a trilinear array, often with a fourth unfiltered
(white) row used for correction. What happens is that there is no colour
interpolation of adjacent pixels: a value can be recorded for Red, Green, and
Blue at every pixel. Film scanners are actually much slower than direct digital
cameras, so this approach works. Some of the high end scanning backs, or
scanning cameras, are capable of extremely high resolution and accuracy, though
they are only useful for non-moving subject matter. Try this article for more:

http://www.adamwilt.com/TechDiffs/CCDColor.html
Excellent overview of the technologies.

Anyway, when you read a bit more about this stuff, you realize that even getting
accurate colour from a film scan can be tough. Also, since a computer monitor is
RGB, colours approaching Cyan, or near pure Yellow, are impossible to display
accurately. One needs to use a colour picker tool (the eyedropper in Photoshop)
to tell whether some colours are correct. So depending upon the scanner, the
skill of the operator, the editing software, and the monitor, some might find
that scanned film does not offer an advantage over direct digital imaging. These
skills can be tough to master, meaning that the best choice might be direct
digital.
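
For what it is worth, the same numeric check can be done outside Photoshop; here
is a tiny sketch using the Pillow library (the file name and coordinates are
just placeholders):

# Sample one pixel's RGB value to compare against the expected patch value,
# rather than trusting how the colour looks on the monitor.
from PIL import Image

img = Image.open("scanned_frame.tif").convert("RGB")
r, g, b = img.getpixel((120, 340))   # coordinates of the patch to check
print(f"R={r} G={g} B={b}")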



5. And how about those other parameters I mentioned briefly above - like
different kinds of geometric distortions, noise, flat field bias etc.?
Can those be compared with the ones of plain old film?


Noise can be roughly compared to grain. However, a word of caution here: what
you see on a monitor may never appear on a final print. While it may depend upon
the printing technology in use, quite often there is no visible noise (or grain)
on a print, even when it seemed to be visible on a computer monitor.

Okay, so obviously a chip should be flatter than a piece of film, especially as
one goes larger than 35 mm. An imaging chip is actually three dimensional, with
a type of well to capture the light striking each pixel. There are also dead
areas between pixels, though microlens arrays and diffraction layers try to
avoid problems with that.

Film, by contrast, is not a single flat layer: the molecules and grain clumps
overlap on many different layers within the film. Light can sensitize any
particular point, from nearly any angle of contact.

Film grains are under 3 µm in size. Pixels on imaging chips have an optimum
response at around 6.8 µm to 8 µm in size; too small reduces sensitivity to
light, while too large introduces noise. There are drum scanners that can image
at about 3 µm, though they are very new and not very common. CCD film scanners
are capable of much less than drum scanners.
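
To relate those micron figures back to the dpi-style numbers used for scanners,
here is a quick sketch (my own conversion arithmetic):

# 1 inch = 25,400 microns, so a sampling spot or pixel pitch in microns
# maps directly to a samples-per-inch figure.
MICRONS_PER_INCH = 25_400

def microns_to_dpi(spot_um):
    """Samples per inch for a given sampling spot / pixel pitch in microns."""
    return MICRONS_PER_INCH / spot_um

for um in (3, 6.8, 8):
    print(f"{um} micron pitch -> about {microns_to_dpi(um):.0f} samples per inch")

# 3 microns   -> about 8467 per inch
# 6.8 microns -> about 3735 per inch
# 8 microns   -> about 3175 per inch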



6. And the chromatic aberration effects? How serious are they these days?
And are the full-frame sensors that are actually found in some high-end
cameras now, in any way comparable to film in that respect?


Sure. I suggest looking at the test images from DPReview. Obviously, it is
tough to tell just by looking on a monitor, though you could try printing some
of the images.



Well, maybe some people will say I have a somewhat critical or
conservative attitude towards digital cameras, but I actually think you
ought to be a bit sceptical when something "new and wonderful" comes
along; new technology is too often introduced for technology's own sake, IMO.

- Toralf


I think the technology is improving all the time. Obviously, the gear available
next year will be better than the gear there is today, and will be priced lower.
I think we will see many more full 35 mm sized imaging chips in the near future,
and I expect that to be the most common type of SLR by 2008. Film still offers
some great choices, especially for those more concerned with colour (or B/W)
than with outright resolution. Both methods still offer advantages, so I still
think they are complementary technologies.

Ciao!

Gordon Moat
A G Studio
http://www.allgstudio.com Updated!