A Photography forum. PhotoBanter.com

Resolution limit of image sensor



 
 
  #1  
Old January 6th 07, 07:27 PM posted to rec.photo.digital
Marc Wossner

Hi NG,

Can someone please explain to me whether there is a connection between the
Nyquist sampling theorem and the resolution limit of a digital image
sensor? I mean, does it imply something like a lower limit as far as
pixel spacing is concerned? I'm quite new to digital photography and
keep reading about this stuff, but must admit that it's by far too
theoretical for me!

Marc Wossner

  #2  
Old January 6th 07, 07:32 PM posted to rec.photo.digital
Cgiorgio

I would assume that the sampling theorem applies to the post-processing
chain (A/D conversion and subsequent processing) but not to the image
sensor itself, where quantum efficiency plays a role. Transfer functions
can be used to characterize lenses.

"Marc Wossner" wrote in message
ups.com...


  #3  
Old January 6th 07, 09:04 PM posted to rec.photo.digital
Stephen M. Dunn

In article . com "Marc Wossner" writes:
$Can someone please explain to me if there is a connection between the
$Nyquist sampling theorem and the resolution limit of a digital image
$sensor?

Yes, there's a connection, but it's not the only factor.

My Canon EOS 20D's sensor has a resolution of 3504x2336 (that's
effective pixels; as with most sensors, there are some additional pixels
that don't actually get used as part of the image data). Nyquist
tells us that this can produce at most half that many line pairs in
each direction. That sets an upper limit on resolution for any given sensor.
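The Nyquist arithmetic above is simple enough to sketch in a few lines of Python. The pixel counts are the EOS 20D figures quoted in the post; the helper function name is just for illustration.

```python
# Nyquist limit from pixel counts: a sampled image can represent at most
# one line pair (one black line + one white line) per two pixels.

def nyquist_line_pairs(pixels: int) -> int:
    """Maximum number of line pairs resolvable across `pixels` samples."""
    return pixels // 2

width_px, height_px = 3504, 2336          # EOS 20D effective pixels
print(nyquist_line_pairs(width_px))       # 1752 line pairs horizontally
print(nyquist_line_pairs(height_px))      # 1168 line pairs vertically
```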

But there are other factors that come into play. Lenses are
not perfect; they all result in some level of loss of contrast
and/or sharpness. So if you were to quadruple the number of pixels
by doubling the number in each dimension, that wouldn't necessarily
result in images with twice the sharpness or twice the detail, if
for no other reason than that the lens may not be up to the task.

You could also take the same number of pixels and make them
larger. The 20D's sensor is about 22x15mm, so it has approximately
160 pixels per millimeter. The 1D Mark IIN has the same number
of effective pixels but in a ~29x19mm sensor, yielding about
120 pixels per millimeter. So despite the same number of pixels,
a lens that delivers sharper results at 60 lp/mm than at 80 lp/mm
will give you sharper images on the 1D IIN than on the 20D.
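The pixel-pitch comparison works out as follows; a quick sketch using the approximate sensor widths quoted in the post (22 mm for the 20D, 29 mm for the 1D Mark IIN), which are rounded figures, not exact specs.

```python
# Pixel density and the sensor's Nyquist frequency in line pairs per mm.

def pixels_per_mm(pixels: int, sensor_mm: float) -> float:
    return pixels / sensor_mm

def nyquist_lp_per_mm(pixels: int, sensor_mm: float) -> float:
    # One line pair needs two pixels.
    return pixels / (2 * sensor_mm)

print(round(pixels_per_mm(3504, 22)))      # ~159 px/mm on the 20D
print(round(pixels_per_mm(3504, 29)))      # ~121 px/mm on the 1D Mark IIN
print(round(nyquist_lp_per_mm(3504, 22)))  # ~80 lp/mm
print(round(nyquist_lp_per_mm(3504, 29)))  # ~60 lp/mm
```

These are the same ~80 and ~60 lp/mm figures the lens comparison in the post turns on.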

$ I mean, does it imply something like a lowest mark as far as
$pixel spacing is concerned?

You can certainly increase the maximum resolution that the sensor
can capture by making the pixels smaller. But then you run into
other problems. Noise is one major problem here. There's a certain
level of background noise. A larger pixel captures more photons,
so the ratio of signal (photons) to noise can be pretty good. A
smaller pixel captures fewer photons, yielding a lower signal to noise
ratio. There's also photon shot noise, which follows a Poisson
distribution: photon arrivals are random, and even if you take a picture
of a subject which is perfectly even, some pixels will receive a few
more photons than others. Again, in a large pixel this random variation
is small relative to the total number of photons, while in a small pixel
it can be significant.
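A useful consequence of Poisson statistics is that the signal-to-noise ratio from shot noise alone is the square root of the photon count. A minimal sketch; the photon counts below are purely illustrative, not measured values for any camera.

```python
import math

# Photon shot noise follows Poisson statistics: for a mean of N photons
# the standard deviation is sqrt(N), so SNR = N / sqrt(N) = sqrt(N).

def shot_noise_snr(mean_photons: float) -> float:
    return math.sqrt(mean_photons)

large_pixel = 40_000   # hypothetical photon count in a large DSLR pixel
small_pixel = 2_500    # hypothetical photon count in a tiny P&S pixel

print(shot_noise_snr(large_pixel))  # 200.0
print(shot_noise_snr(small_pixel))  # 50.0
```

So a pixel that collects 16x more photons enjoys a 4x better shot-noise SNR, which is the big-pixel advantage described above.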

If you've ever compared a typical shot at ISO 400 from a compact
digital P&S (which has a relatively small sensor and therefore
tiny pixels) to a typical shot at ISO 400 from a DSLR (which has a
relatively large sensor and therefore large pixels), you'll understand
this in practical terms: the P&S picture is significantly noisier
than the DSLR picture.

There are also issues about data volumes as the number of
pixels rises. An 8 MP camera usually produces JPEGs that are
in the 3-4 MB ballpark. A 16 MP camera would produce files that are
about twice that size. How big a file do you need? How big a file
do you want to have to store? How much flash memory do you want to
have to take on holiday with you in order to store all your pictures?

And speed ... the 1D IIN can take about 8.5 frames per second,
with a resolution of 8.2 MP. The 1Ds II has a 16.7 MP sensor and can
only shoot at about 4 frames per second. It's not because the mechanical
bits can't keep up (both cameras are very similar mechanically, and are
based on a film camera, the 1V, which can get up to 10 fps) or because
Canon's engineers got lazy when designing the 1Ds II; there's
simply too much data to be moved around. There are digital backs for
medium-format cameras which yield tens of megapixels, and they
typically can't even do two frames per second, for the same reason.
--
Stephen M. Dunn
---------------- http://www.stevedunn.ca/ ----------------

------------------------------------------------------------------
Say hi to my cat -- http://www.stevedunn.ca/photos/toby/
  #4  
Old January 6th 07, 09:34 PM posted to rec.photo.digital
Mike Russell

"Marc Wossner" wrote in message
ups.com...
> Hi NG,
>
> Can someone please explain to me if there is a connection between the
> Nyquist sampling theorem and the resolution limit of a digital image
> sensor?


In imaging terms, Nyquist defines maximum possible sharpness in terms of the
spacing between pixels. More pixels equals more sharpness, all else being
equal.

> I mean, does it imply something like a lowest mark as far as
> pixel spacing is concerned?


Another way to think of it is that - optics aside - the most detail you can
have in an image is every other pixel alternating between black and white,
in a checkerboard pattern. If you attempt to represent finer detail, your
image gets killed by a crazed pattern called moiré.

> I'm quite new to digital photography and
> keep reading about this stuff but must admit that it's by far too
> theoretical for me!


There's a world of mathematical theory behind sharpness: Fourier Transforms,
Modulation Transfer Functions. Some of us propeller-head types love this,
but it's a world of hurt if you go in there unwillingly. The rest of us can
ignore it, because the following rules of thumb will get the job done, and
only fourth-grade arithmetic is required to apply them to your own images.

Rules of thumb: a razor sharp print is about 320 pixels per inch. Some
people claim they can see more than that - hah. A very acceptably sharp
print is about 200 pixels per inch. At 100 pixels per inch, ordinary people
may be aware of some jagginess in the image. Less than 100 pixels per inch
is generally considered unacceptable - but look how great your monitor looks
at even less than that. I've gotten away with 72 pixels per inch (also
known as dots per inch in the trade) on a few occasions. Viewing distance
matters too: people stand back more from a larger print, so you can go even
larger than the ppi numbers would indicate. Check this out yourself by
walking up to a billboard - it's lucky to be one pixel per inch.

Arithmetic: You can ignore megapixels. All we need is the image dimensions
in pixels.

My older camera took (and still takes) a 2048 by 1536 image. Dividing by
320 says a razor sharp image from this camera would be about 6 by 4 inches.
Dividing by 200 gives about a 10 by 8 image. Larger than that, and the
image starts to look a bit fuzzy, though remember that fuzziness is more
acceptable in a larger image.

My newer camera takes a 3204 by 2136 image. Dividing by 320 says that I can
print a razor sharp 8x10 image. Dividing by 200 gives me a pretty darn
sharp 16 by 10 image, and a very nice 19 by 11. In fact, this camera
produces 11 by 19 prints that look wonderful at just under 200 ppi.
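The fourth-grade arithmetic above is easy to script; a sketch using the two camera resolutions quoted in the post, with an illustrative helper function.

```python
# Print size in inches is just pixel dimensions divided by the target
# pixels-per-inch (ppi).

def print_size_inches(width_px: int, height_px: int, ppi: int):
    return (width_px / ppi, height_px / ppi)

# Older camera: 2048 x 1536
print(print_size_inches(2048, 1536, 320))  # (6.4, 4.8) -> razor sharp, ~6x4
print(print_size_inches(2048, 1536, 200))  # (10.24, 7.68) -> about 10x8

# Newer camera: 3204 x 2136
print(print_size_inches(3204, 2136, 200))  # (16.02, 10.68) -> about 16x10
```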

BTW - curvemeister class starts tomorrow, Sunday the 7th:
http://www.curvemeister.com/support/class/index.htm
--
Mike Russell
www.curvemeister.com/forum/


  #5  
Old January 6th 07, 10:03 PM posted to rec.photo.digital
Charles Schuler


"Marc Wossner" wrote in message
ups.com...

As the image detail approaches one-half the spatial sampling frequency,
aliasing starts to become a problem. Aliasing means that artifacts show up
that were not in the scene, caused by too low a spatial sampling
frequency or too much image detail. The fix is a blur filter (also called
an anti-alias filter) mounted on top of the sensor.
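The folding behaviour behind those artifacts can be sketched numerically: any frequency above half the sampling rate shows up as a false lower frequency. The sampling rate below is an arbitrary example value.

```python
# A signal frequency above the Nyquist frequency (half the sampling rate)
# is "folded" back and appears as a lower, false frequency - the aliasing
# described above.

def aliased_frequency(f_signal: float, f_sample: float) -> float:
    """Apparent frequency after sampling, folded into 0..f_sample/2."""
    f = f_signal % f_sample
    return min(f, f_sample - f)

fs = 100.0                        # e.g. 100 samples per mm
print(aliased_frequency(30, fs))  # 30.0: below Nyquist (50), reproduced faithfully
print(aliased_frequency(70, fs))  # 30.0: above Nyquist, aliases down to 30
print(aliased_frequency(90, fs))  # 10.0: closer still to fs, aliases even lower
```

Note that 70 and 90 cycles/mm come out indistinguishable from genuine 30 and 10 cycles/mm detail, which is why the only real fix is to blur them away before sampling.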


  #6  
Old January 7th 07, 12:53 PM posted to rec.photo.digital
Marc Wossner


Thank you all for your exhaustive replies, I understand things much
better now!
So, as I still use silver film too, can it be said that the Nyquist
theorem limits its resolution in a very fundamental way as well, but
that the limit matters less because the silver halide crystals are
smaller than the individual pixels?

Marc Wossner

  #7  
Old January 7th 07, 02:00 PM posted to rec.photo.digital
Marc Wossner

Just to check my understanding: if I have a sensor with 3034 horizontal
pixels and a spacing of 7.8 µm, it can resolve:

max frequency = scan frequency / 2

= 1517 lines in this direction
= structures which are not closer than 3.9 µm to each other

Is that correct?
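Working through the arithmetic in the question above as a sketch: the line-pair count is right, but note that the finest resolvable *period* is two pixel pitches, so adjacent resolvable structures end up one pitch (7.8 µm) apart, not 3.9 µm.

```python
# Arithmetic check for 3034 pixels at a 7.8 um pitch.

pixels = 3034
pitch_um = 7.8

line_pairs = pixels // 2                      # Nyquist: 2 pixels per line pair
finest_period_um = 2 * pitch_um               # one black + one white line
sensor_width_mm = pixels * pitch_um / 1000.0  # ~23.7 mm across
lp_per_mm = line_pairs / sensor_width_mm

print(line_pairs)           # 1517 line pairs across the sensor
print(finest_period_um)     # 15.6 um per resolvable black+white pair
print(round(lp_per_mm, 1))  # ~64.1 lp/mm
```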

Marc Wossner

  #8  
Old January 7th 07, 05:18 PM posted to rec.photo.digital
Don Stauffer in Minnesota


Marc Wossner wrote:


There was much work done on this in the thirties by early TV engineers,
since in the vertical direction image-tube TV cameras and kinescopes are
"sampled" systems. Much of the work was done experimentally, and the
guru was an engineer by the name of Ray Kell. The resulting widely
used value, now called the Kell factor, was around 0.7. That is, for a
system with N samples in a given direction (either vertical or
horizontal) one can resolve about 0.7N lines.

When we got our hands on our first CCD chip at work in the late '70s, I
did an analysis (numerical, sort of Monte Carlo) and found a value
very close to that for mosaic arrays. While it did depend a bit on
fill factor, the dependence wasn't strong. I still use 70% as a good
expectation.
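The Kell factor makes for a one-line correction on top of the raw Nyquist count; a sketch using the 0.7 value from the post and, for illustration, the 3504x2336 pixel counts mentioned earlier in the thread.

```python
# A sampled system resolves only about 0.7 * N lines for N samples in a
# given direction (the Kell factor), rather than the full Nyquist count.

KELL_FACTOR = 0.7

def resolvable_lines(samples: int, kell: float = KELL_FACTOR) -> int:
    return int(samples * kell)

print(resolvable_lines(2336))  # ~1635 lines vertically
print(resolvable_lines(3504))  # ~2452 lines horizontally
```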

  #9  
Old January 7th 07, 07:34 PM posted to rec.photo.digital
Stephen M. Dunn

In article .com "Marc Wossner" writes:
$So, as I still use silver film too, can it be said that the Nyquist
$theorem limits its resolution in a very fundamental way as well but
$is lower because the silver halide crystals are smaller than the
$individual pixels?

You'll need to find someone who understands film well for a solid
answer on this one. But in any system that uses discrete samples,
Nyquist applies in some fashion. A given silver halide crystal
can't represent both black and white simultaneously, so you need
two of them side by side, one black and one white, and bingo, there's
Nyquist at work again.

But film is more complex than that. On a typical digital sensor,
the sensor sites are all the same size, and are laid out in a
perfectly regular pattern in which they do not overlap. Not so with
film; the crystals are not all the same size, and they are not
laid out regularly, and according to the diagrams I've seen in
film technical data sheets, an emulsion layer is more than one
crystal thick. (For an example, look at Fuji's datasheet for Velvia 50:
http://www.fujifilmusa.com/JSP/fuji/...AF3-960E_1.pdf
and go to page 6.)
--
Stephen M. Dunn
---------------- http://www.stevedunn.ca/ ----------------

------------------------------------------------------------------
Say hi to my cat -- http://www.stevedunn.ca/photos/toby/
  #10  
Old January 7th 07, 07:58 PM posted to rec.photo.digital
John McWilliams

Lionel wrote:
> On Sat, 6 Jan 2007 12:34:57 -0800, "Mike Russell" wrote:
>
>> There's a world of mathematical theory behind sharpness: Fourier Transforms,
>> Modulation Transfer Functions. Some of us propeller-head types love this,
>> but it's a world of hurt if you go in there unwillingly.
>
> Nicely put.

My reaction, too.

>> Rules of thumb: a razor sharp print is about 320 pixels per inch. Some
>> people claim they can see more than that - hah. A very acceptably sharp
>> print is about 200 pixels per inch. At 100 pixels per inch, ordinary people
>> may be aware of some jagginess in the image.
>
> Indeed. My eyesight is very good, & I can't tell the difference
> between 250 & 300 DPI. I can just tell the difference between 200 & 300
> DPI - *if* I hold both prints up to my nose in sunlight - but 200 DPI
> is plenty sharp enough in every other situation.

Er, Lionel, *ppi*! And I know you know the difference. IAE, it's nice to
see you back posting here on photography.

Now, who can tell the diff between prints printed at 720 dpi vs. 1440
vs. 2880 dpi?

--
John McWilliams

I know that you believe you understood what you think I said, but I'm
not sure you realize that what you heard is not what I meant.
 



