A Photography forum. PhotoBanter.com


Any thoughts /news on Foveon sensors?



 
 
#181
Old January 22nd 06, 04:42 PM posted to rec.photo.digital

In article , says...

Focusing on subcomponents makes a great deal of sense if it helps you to
understand what's going on.


Dave, in the professional world, the specifications for an image usually
are 300 dpi at a specific print size, not 300 dpi of luminance
resolution, 150 dpi of chrominance, 200 dpi of green channel etc. You
need one number which defines the resolution of the image.

For example, the human eye's luminance resolution is about 60 cycles per
degree. If you make a print that, at normal viewing distance, resolves
60 cycles/degree with good contrast, it's going to be indistinguishable
from the original scene in detail. While if the print resolves 30
cycles/degree or 20 cycles/degree (more typical), it will still look
good but not equal to the original scene.
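As a back-of-the-envelope check on those numbers (the conversion below assumes a line pair needs two dots; the 12-inch viewing distance is just an illustrative choice, not a value from this thread):

```python
import math

def cycles_per_degree(dpi, viewing_distance_in):
    """Angular resolution delivered by a print of the given dpi, viewed
    from the given distance. One cycle (line pair) needs two dots, so
    cycles per inch = dpi / 2."""
    inches_per_degree = viewing_distance_in * math.tan(math.radians(1.0))
    return (dpi / 2.0) * inches_per_degree

def dpi_needed(cpd, viewing_distance_in):
    """Print dpi needed to deliver a given cycles/degree at the eye."""
    inches_per_degree = viewing_distance_in * math.tan(math.radians(1.0))
    return 2.0 * cpd / inches_per_degree

# A 300 dpi print at 12 inches delivers about 31 cycles/degree --
# the "looks good" regime, well short of the 60 cpd acuity limit.
print(round(cycles_per_degree(300, 12), 1))  # 31.4
print(round(dpi_needed(60, 12)))             # 573 dpi to hit 60 cpd
```

So the standard 300 dpi spec sits in the 30 cycles/degree range mentioned above; matching the eye's 60 cycles/degree limit would take roughly twice the print resolution.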


That's all fine, but the problem is that with a Bayer sensor you don't
have accurate luminance information at the single pixel level. In the
RGGB block, the two green pixels give you a reasonably good approximation
of luminance (because green is in the middle of the visible spectrum),
but the red and blue pixels do not.

The red pixel for instance might be illuminated by yellow or green
light, and you will not be able to know if the red pixel received a
somewhat less bright red light or light of another colour.

You need to do some interpolation, which in practice means that you can
get (probably) good enough luminance information, but not at the
*individual* pixel level. In other words, with real world images the
luminance resolution in a Bayer sensor is less than the luminance
resolution in a full colour sensor.
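That ambiguity is easy to demonstrate. Below is a minimal sketch of RGGB sampling in plain NumPy (no demosaicing, and deliberately ignoring the AA filter): a red field and a yellow field give identical readings at every red and blue photosite, and only the green sites tell them apart.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample an H x W x 3 float image through an RGGB pattern,
    returning the single value each photosite records."""
    h, w, _ = rgb.shape
    raw = np.zeros((h, w))
    raw[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R sites
    raw[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G sites
    raw[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G sites
    raw[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B sites
    return raw

red = np.zeros((4, 4, 3))
red[..., 0] = 1.0                         # pure red field
yellow = red.copy()
yellow[..., 1] = 1.0                      # yellow = red + green
raw_red, raw_yellow = bayer_mosaic(red), bayer_mosaic(yellow)
print(raw_red[0, 0] == raw_yellow[0, 0])  # True: R site can't tell
print(raw_red[0, 1] == raw_yellow[0, 1])  # False: G site can
```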

Of course if you use black and white test targets you don't see this
resolution loss, because there is no colour.

snip

Have you seen this:
http://clarkvision.com/imagedetail/r...ens-sharpness/

(look at the coloured boxes on the right side and cry with me)


Why cry? It looks like whenever you can resolve modulation in the
black/white pattern, you can still resolve colour in the patches on the
right. What more could you want?


Did you have a look at figure 2, for instance? With six-pixel-wide
lines, you can still clearly see the individual lines in the
monochromatic patterns, but the lines with the same width in the
coloured patterns to the right are hopelessly merged (the three red-
blue-red lines become a homogeneous magenta block, just to make an
example). The blue and green lines are also merged, as are the red and
magenta lines.

In any case, it seems that we differ on the amount of resolution loss
there is in a Bayer sensor (compared to a full colour sensor). My
guesstimate is that on average a Bayer sensor has about 60-70% of the
nominal resolution (the emphasis being on "average" - the loss will
depend on the scene photographed). Not too terrible, but not negligible.


What is "nominal"? Are you comparing against the Nyquist limit of 2
pixels per line pair? No camera can achieve that without aliasing.
Good B&W cameras, with no Bayer filter, don't do more than about 80% of
Nyquist either.


I just mean that with real world images a Bayer CCD with 6MP should have
approx. 4MP of "real" resolution (= 60-70% of 6MP), i.e. the same
resolution as a full colour 4MP sensor (or let's say the same
information content). With an 8MP Bayer sensor you should have around
6MP of "real" pixels. But this is of course just a personal guesstimate.
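For what it's worth, the 80% linear Nyquist fraction quoted above squares to a 64% pixel-count factor, which lands close to this guess:

```python
def equivalent_mp(nominal_mp, linear_fraction=0.8):
    """Pixel count of an 'ideal' sensor with the same resolving power,
    given the fraction of Nyquist resolved linearly. Resolution is a
    linear measure, pixel count an area measure, hence the square."""
    return nominal_mp * linear_fraction ** 2

print(round(equivalent_mp(6), 2))  # 3.84 -- the "approx. 4MP" above
print(round(equivalent_mp(8), 2))  # 5.12
```

Conversely, a 60-70% area figure corresponds to resolving about 77-84% of Nyquist linearly, so the two estimates are not far apart.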

In any case, since the matter is important I'm surprised there is so
little research on the topic, or at least I have seen very little of it.
--

Alfred Molon
------------------------------
Olympus 50X0, 7070, 8080, E300, E500 forum at
http://groups.yahoo.com/group/MyOlympus/
Olympus E500 resource - http://myolympus.org/E500/
#182
Old January 22nd 06, 07:16 PM posted to rec.photo.digital

Alfred Molon writes:
In article , says...


Focusing on subcomponents makes a great deal of sense if it helps you to
understand what's going on.


Dave, in the professional world, the specifications for an image usually
are 300 dpi at a specific print size, not 300 dpi of luminance
resolution, 150 dpi of chrominance, 200 dpi of green channel etc. You
need one number which defines the resolution of the image.


You're talking about a different meaning of "resolution" - pixel count.
The resolution I'm talking about is a measurement of the ability of an
image to resolve detail. It's unfortunate that the same term gets used
for both.

They are related; the pixel count sets an upper bound on the ability to
resolve detail. But ability to resolve detail can be far below that
upper limit (for example, if the image is out of focus).

However, I don't understand your argument. Every image does have a
single number that describes pixel count, but people also understand
that pixel count alone does not determine ability to resolve detail.
Nor does it mean that colour and B&W resolution are equal.

Nearly every JPEG in existence, even if it came from a 3CCD camera or a
flatbed scanner or a film scanner, has half the horizontal colour
resolution as it does luminance resolution. The best video recorders
also have chroma resolution half that of luma resolution. Yet no one
seems to feel the need to say "the video is 720x480, but the chroma is
only 360x480".

Lower resolution chroma is the norm, not the exception, and the "single
number" resolution figure refers to pixel count.
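A sketch of that 2:1 horizontal chroma downsampling in NumPy, using full-range Rec.601-style conversion coefficients (the exact matrix varies by standard; this one is illustrative):

```python
import numpy as np

def subsample_chroma_422(rgb):
    """Split RGB into luma + two chroma channels and halve the
    horizontal chroma resolution -- roughly what 4:2:2 video
    (and many JPEGs, which typically use 4:2:0) does."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr =  0.500 * r - 0.419 * g - 0.081 * b
    # average each horizontal pair of chroma samples (2:1 subsampling)
    cb = (cb[:, 0::2] + cb[:, 1::2]) / 2
    cr = (cr[:, 0::2] + cr[:, 1::2]) / 2
    return y, cb, cr

frame = np.random.rand(480, 720, 3)
y, cb, cr = subsample_chroma_422(frame)
print(y.shape, cb.shape)  # (480, 720) (480, 360): "720x480, chroma 360x480"
```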

That's all fine, but the problem is that with a Bayer sensor you don't
have accurate luminance information at the single pixel level. In the
RGGB block, the two green pixels give you a reasonably good approximation
of luminance (because green is in the middle of the visible spectrum),
but the red and blue pixels do not.


The red pixel for instance might be illuminated by yellow or green
light, and you will not be able to know if the red pixel received a
somewhat less bright red light or light of another colour.


You're ignoring the anti-aliasing filter. If a proper AA filter is
present, a single point in the source image, no matter how small, will
deliver some of its light to a minimum of 4 adjacent pixels on the Bayer
sensor. The pixels simply are not independent in the way your
description assumes.
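A toy illustration of that spreading, with a 2x2 box blur standing in for the real AA filter (actual filters are birefringent and not literal box blurs, so this is only a cartoon):

```python
import numpy as np

def box_blur_2x2(img):
    """Average each pixel with its right, lower and lower-right
    neighbours -- a crude stand-in for an optical AA filter."""
    h, w = img.shape
    padded = np.pad(img, ((0, 1), (0, 1)), mode='edge')
    out = np.zeros((h, w))
    for dy in range(2):
        for dx in range(2):
            out += 0.25 * padded[dy:dy + h, dx:dx + w]
    return out

point = np.zeros((6, 6))
point[2, 2] = 1.0                 # a single bright point in the scene
blurred = box_blur_2x2(point)
print(np.count_nonzero(blurred))  # 4: the light lands on 4 photosites
print(blurred.sum())              # 1.0: total light is conserved
```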

You need to do some interpolation, which in practice means that you can
get (probably) good enough luminance information, but not at the
*individual* pixel level. In other words, with real world images the
luminance resolution in a Bayer sensor is less than the luminance
resolution in a full colour sensor.


That's a nice handwaving argument, but (as noted) it doesn't take into
account the AA filter. And just look at the resolution tests of any
recent Bayer-filter camera to see that Bayer sensors *do* resolve about
80% of the theoretical limit in luminance. This is as good as the
luminance resolution of a 3-colour or B&W camera with proper
anti-aliasing.

Of course if you use black and white test targets you don't see this
resolution loss, because there is no colour.


So you're saying that resolution test targets can't be expected to show
the degradation you argue should exist. Do you have an example of
something that does?

Roger Clark's page, which you referenced, has colour-on-white targets as
well as black-on-white and colour-on-colour. By your argument, the
colour-on-white targets should show lower resolution than the
black-on-white targets, because there is colour present. Yet I see no
difference in resolution. Do you?


http://clarkvision.com/imagedetail/r...ens-sharpness/

Did you have a look at figure 2, for instance? With six-pixel-wide
lines, you can still clearly see the individual lines in the
monochromatic patterns, but the lines with the same width in the
coloured patterns to the right are hopelessly merged (the three red-
blue-red lines become a homogeneous magenta block, just to make an
example). The blue and green lines are also merged, as are the red and
magenta lines.


But you can clearly see the yellow line standing out from the ones on
either side of it, because it is significantly brighter. So the
*luminance component* of this colour pattern is still resolved just
fine.

The colour differences between similar-luminance colour patches are
blurred together, but you can see that it's only a factor of 2 loss.
The individual colour bars are resolved in the 8-pixel pattern, better
than the luminance in the 4-pixel black on white pattern. So this is
entirely consistent with a Bayer sensor's chroma-only resolution being a
factor of 2 worse than its luminance resolution (although, in these
tests, we can't tell how much of this is due to the sensor and how much
to the lens).

But again, why is this a problem? Your eye's chroma-only resolution is
*ten times worse* than its luminance resolution, and a Bayer camera
delivers chroma resolution 5 times better than that minimum. If you
display Roger's figure 1 on your screen and then back up until you can't
resolve the 4-pixel pattern with your eyes (to roughly match the lens
resolution in figure 2), your eye won't be able to see well-defined
colour bars in the colour-on-colour pattern either.

I just mean that with real world images a Bayer CCD with 6MP should have
approx. 4MP of "real" resolution (= 60-70% of 6MP), i.e. the same
resolution as a full colour 4MP sensor (or let's say the same
information content). With an 8MP Bayer sensor you should have around
6MP of "real" pixels. But this is of course just a personal guesstimate.


Don't mix resolution and "information content". They're not the same.

If you're trying to create a measure based on resolution (the ability
to resolve details), then this measure makes no sense. First, it's not
good to confuse resolution measurements, which are conventionally
linear measurements, with pixel counts, which are roughly proportional
to the square of linear resolution because pixels are an area measure.
Plus you seem to be comparing measured resolution for Bayer cameras
against the theoretical maximum (Nyquist limit) resolution that is not
achievable by any real camera.

For example, a Bayer sensor camera resolves about 80 percent of the
theoretical maximum linear resolution possible for its pixel count.
In theory, this same information could be stored in 64% as many pixels.
So a 6 MP Bayer camera could be called equivalent to a 4 MP "optimal"
camera. But a 6 MP B&W sensor or a 6 MP 3-CCD camera will *also* only
resolve about 80% of Nyquist (if there is proper AA filtering), so by
your same measure these are *also* only truly 4 MP cameras.

In fact, the only way to get a "true 6 MP" image by your measure is to
shoot or scan at somewhat higher resolution (9 MP or more) and then
digitally downsample to 6 MP using a method with very sharp lowpass
filtering. And this may have such sharp edges as to cause problems
further down the line in processing.

On the other hand, the Bayer sensor image does have less information
content than the 3-CCD colour image because the colour resolution is
lower. But this isn't a very useful comparison measure *because it
doesn't take visibility into account*. Having extra information content
does you no good in photography if you can't see the extra information.

In any case, since the matter is important I'm surprised there is so
little research on the topic, or at least I have seen very little of it.


What do you think is neglected? There's lots of stuff known about the
difference between luminance and chroma response of the human eye, for
example.

Dave
#183
Old January 22nd 06, 09:03 PM posted to rec.photo.digital

In article , says...

You're talking about a different meaning of "resolution" - pixel count.
The resolution I'm talking about is a measurement of the ability of an
image to resolve detail. It's unfortunate that the same term gets used
for both.


Actually I was also referring to that (the ability of an image to
resolve detail or, to put it differently, its information content).

Nearly every JPEG in existence, even if it came from a 3CCD camera or a
flatbed scanner or a film scanner, has half the horizontal colour
resolution as it does luminance resolution. The best video recorders
also have chroma resolution half that of luma resolution. Yet no one
seems to feel the need to say "the video is 720x480, but the chroma is
only 360x480".


Huh? In a 3CCD camera the colour resolution should be the same as the
luminance resolution. If JPEG is the problem, then let's use TIFF.

You're ignoring the anti-aliasing filter. If a proper AA filter is
present, a single point in the source image, no matter how small, will
deliver some of its light to a minimum of 4 adjacent pixels on the Bayer
sensor. The pixels simply are not independent in the way your
description assumes.


And I remember some time ago you admitted there could be an aliasing
problem, because cameras have AA filters dimensioned for the luminance
resolution, but not for the colour resolution, resulting in colour
aliasing.

The problem is that with a Bayer sensor you have different Nyquist
frequencies for luminance and chrominance, and dimensioning the AA
filter for the lower colour Nyquist frequency would kill luminance
detail.
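To put rough numbers on those two Nyquist frequencies: in an RGGB mosaic with pixel pitch p, the red and blue sites form square grids of pitch 2p, so their Nyquist limit is half that of the full grid (the green quincunx sits in between; the 6-micron pitch below is purely hypothetical):

```python
def nyquist_lp_per_mm(pitch_um):
    """Nyquist limit in line pairs per mm for a square sampling grid:
    one line pair needs two samples."""
    return 1000.0 / (2.0 * pitch_um)

p = 6.0  # hypothetical pixel pitch in microns
print(round(nyquist_lp_per_mm(p), 1))      # 83.3 lp/mm: full-grid luminance
print(round(nyquist_lp_per_mm(2 * p), 1))  # 41.7 lp/mm: red/blue chroma
```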

Of course if you use black and white test targets you don't see this
resolution loss, because there is no colour.


So you're saying that resolution test targets can't be expected to show
the degradation you argue should exist. Do you have an example of
something that does?


Think of a scene where the colour changes between adjacent pixels.

Roger Clark's page, which you referenced, has colour-on-white targets as
well as black-on-white and colour-on-colour. By your argument, the
colour-on-white targets should show lower resolution than the
black-on-white targets, because there is colour present. Yet I see no
difference in resolution. Do you?


It's the colour on colour targets which should show less resolution (as
they in fact do on Roger's page).

http://clarkvision.com/imagedetail/r...ens-sharpness/

Did you have a look at figure 2, for instance? With six-pixel-wide
lines, you can still clearly see the individual lines in the
monochromatic patterns, but the lines with the same width in the
coloured patterns to the right are hopelessly merged (the three red-
blue-red lines become a homogeneous magenta block, just to make an
example). The blue and green lines are also merged, as are the red and
magenta lines.


But you can clearly see the yellow line standing out from the ones on
either side of it, because it is significantly brighter. So the
*luminance component* of this colour pattern is still resolved just
fine.


By the way, if you desaturate Roger's image, the colour blocks become
homogeneous grey. If you instead convert to grayscale, you will see that
the yellow line is not that much brighter than the blue or green lines.
I guess what plays a role here is the "distance" between the colours -
yellow, for instance, being the complement of blue.

The colour differences between similar-luminance colour patches are
blurred together, but you can see that it's only a factor of 2 loss.


2 is a lot.

But again, why is this a problem? Your eye's chroma-only resolution is
*ten times worse* than its luminance resolution, and a Bayer camera
delivers chroma resolution 5 times better than that minimum. If you
display Roger's figure 1 on your screen and then back up until you can't
resolve the 4-pixel pattern with your eyes (to roughly match the lens
resolution in figure 2), your eye won't be able to see well-defined
colour bars in the colour-on-colour pattern either.


The question is, how much higher is the information content (or the
ability to resolve detail) in a full-colour sensor with respect to a
Bayer sensor. What do we gain with a full colour sensor?

With the full-colour sensor there is the full luminance information at
each pixel and there are no aliasing issues caused by different Nyquist
frequencies for chrominance and luminance.

The aliasing or error introduced in a Bayer sensor in real world images
has the effect of lowering the sensor's capability to resolve detail.

snip

In any case, since the matter is important I'm surprised there is so
little research on the topic, or at least I have seen very little of it.


What do you think is neglected? There's lots of stuff known about the
difference between luminance and chroma response of the human eye, for
example.


I only remember having seen once a paper which tested the loss caused by
the bayer pattern.
--

Alfred Molon
http://www.molon.de - 6500 photos of Asia, Africa and Europe
#184
Old January 23rd 06, 03:11 PM posted to rec.photo.digital

Alfred Molon writes:

Nearly every JPEG in existence, even if it came from a 3CCD camera or a
flatbed scanner or a film scanner, has half the horizontal colour
resolution as it does luminance resolution. The best video recorders
also have chroma resolution half that of luma resolution. Yet no one
seems to feel the need to say "the video is 720x480, but the chroma is
only 360x480".


Huh? In a 3CCD camera the colour resolution should be the same as the
luminance resolution. If JPEG is the problem, then let's use TIFF.


The resolution of the data coming off the CCD is the same in colour as
luminance. But if you record the data using DV or D-1, the highest
resolution digital video formats, the chroma is immediately downsampled
horizontally by a factor of 2 before recording. If you use Betacam,
which is also regarded as near broadcast quality, the chroma is
downsampled by a factor of 4 before recording. And most JPEG images use
2:1 chroma downsampling. This works fine *because you can't see the
loss at any reasonable viewing distance*.

JPEG isn't a problem, because it throws away what you can't see.

You're ignoring the anti-aliasing filter. If a proper AA filter is
present, a single point in the source image, no matter how small, will
deliver some of its light to a minimum of 4 adjacent pixels on the Bayer
sensor. The pixels simply are not independent in the way your
description assumes.


And I remember some time ago you admitted there could be an aliasing
problem, because cameras have AA filters dimensioned for the luminance
resolution, but not for the colour resolution, resulting in colour
aliasing.


True, but you were ignoring the AA filter *completely* in your
description of why luminance "could not be accurate". Are you just
trying to find something to complain about, or do you want your
complaints to be accurate?

The problem is that with a Bayer sensor you have different Nyquist
frequencies for luminance and chrominance, and dimensioning the AA
filter for the lower colour Nyquist frequency would kill luminance
detail.


So in practice the AA filter is set up for the pixel pitch, providing
proper anti-aliasing for luminance and allowing some local chroma error.
But, again, if the error is so localized that you can't see it, does
that matter?

So you're saying that resolution test targets can't be expected to show
the degradation you argue should exist. Do you have an example of
something that does?


Think of a scene where the colour changes between adjacent pixels.


Thought experiments that ignore the AA filter, and so are demonstrably
wrong, don't convince me of anything. Do you have (a) a real example of
an image affected by this, or (b) a simulation of the difference between
black/white and colour/white or black/colour cases - either mathematical
or numerical - that shows the effect you suggest?

Roger Clark's page, which you referenced, has colour-on-white targets as
well as black-on-white and colour-on-colour. By your argument, the
colour-on-white targets should show lower resolution than the
black-on-white targets, because there is colour present. Yet I see no
difference in resolution. Do you?


It's the colour on colour targets which should show less resolution (as
they in fact do on Roger's page).


But you previously said that the mere presence of colour would reduce
resolution, so the colour-on-white targets should be affected too.
Right? The colour-on-colour targets show less resolution only in the
case where adjacent bars are similar in luminance - which is the case
where your eye also has drastically reduced resolution.

But you can clearly see the yellow line standing out from the ones on
either side of it, because it is significantly brighter. So the
*luminance component* of this colour pattern is still resolved just
fine.


By the way, if you desaturate Roger's image, the colour blocks become
homogeneous grey. If you instead convert to grayscale, you will see that
the yellow line is not that much brighter than the blue or green lines.


Huh? Conversion to greyscale should give you the same effect as
complete desaturation. If it doesn't, your image editing tool is
implementing one of those operations incorrectly.

I guess what plays a role here is the "distance" between the colours -
yellow, for instance, being the complement of blue.


No, what matters is the luminance. Saturated yellow is very bright,
nearly as bright as full white, while saturated blue is very dark.
Colour bars that differ substantially in luminance remain resolvable
*because the luminance component is resolved* even if the colour is not.
Colour bars that are close in luminance have *only* colour to
distinguish between them, and the colour-only difference is lower in
resolution.
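The numbers bear this out, and they may also explain the desaturate-versus-greyscale discrepancy reported above: a Photoshop-style "desaturate" typically uses HSL lightness, (max+min)/2, under which saturated yellow and saturated blue are identical, while greyscale conversion uses luma weights (Rec.601 below; other standards use slightly different weights):

```python
def luma(r, g, b):
    """Rec.601 luma: green dominates, blue contributes least."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def hsl_lightness(r, g, b):
    """HSL lightness, as used by typical 'desaturate' commands."""
    return (max(r, g, b) + min(r, g, b)) / 2.0

yellow, blue = (1.0, 1.0, 0.0), (0.0, 0.0, 1.0)
print(round(luma(*yellow), 3), round(luma(*blue), 3))  # 0.886 0.114
print(hsl_lightness(*yellow), hsl_lightness(*blue))    # 0.5 0.5
```

So the two operations are genuinely different, not necessarily broken: under luma weighting yellow is nearly as bright as white and blue nearly as dark as black, while HSL lightness treats all fully saturated primaries and secondaries the same.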

The colour differences between similar-luminance colour patches are
blurred together, but you can see that it's only a factor of 2 loss.


2 is a lot.


But again, why is this a problem? Your eye's chroma-only resolution is
*ten times worse* than its luminance resolution, and a Bayer camera
delivers chroma resolution 5 times better than that minimum. If you
display Roger's figure 1 on your screen and then back up until you can't
resolve the 4-pixel pattern with your eyes (to roughly match the lens
resolution in figure 2), your eye won't be able to see well-defined
colour bars in the colour-on-colour pattern either.


The question is, how much higher is the information content (or the
ability to resolve detail) in a full-colour sensor with respect to a
Bayer sensor. What do we gain with a full colour sensor?


I'd argue that the important question is "how much higher is the
*visible* information content?". Take both images, convert to a
luma/chroma space like Lab, filter both components using a model of the
eye's MTF at the appropriate viewing distance and print size (thus
removing information the eye cannot see). And *then* compare
information content. For a wide range of normal viewing distances,
there will be *no difference* in visible information content between
Bayer and a 3-CCD sensor, because the small amount of extra
high-frequency information and lower high-frequency chroma errors of the
3CCD sensor are completely invisible to the eye.
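A rough sketch of that comparison pipeline, with Y'CbCr standing in for Lab, a Gaussian blur standing in for the eye's MTF, and purely illustrative sigma values (a real model would derive them from viewing distance and print size):

```python
import numpy as np

def gaussian_blur_1d(x, sigma, axis):
    """Separable Gaussian blur along one axis, reflect-padded."""
    radius = max(1, int(3 * sigma))
    t = np.arange(-radius, radius + 1)
    k = np.exp(-t ** 2 / (2.0 * sigma ** 2))
    k /= k.sum()
    blur = lambda v: np.convolve(np.pad(v, radius, mode='reflect'),
                                 k, mode='valid')
    return np.apply_along_axis(blur, axis, x)

def visible_channels(rgb, luma_sigma=0.7, chroma_sigma=3.5):
    """Split into luma/chroma and low-pass each channel, blurring
    chroma far more heavily, mimicking the eye's much poorer chroma
    resolution. Whatever survives the blur is 'visible' content."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    channels = [0.299 * r + 0.587 * g + 0.114 * b,       # Y'
                -0.169 * r - 0.331 * g + 0.500 * b,      # Cb
                0.500 * r - 0.419 * g - 0.081 * b]       # Cr
    sigmas = [luma_sigma, chroma_sigma, chroma_sigma]
    out = []
    for ch, sigma in zip(channels, sigmas):
        for axis in (0, 1):
            ch = gaussian_blur_1d(ch, sigma, axis)
        out.append(ch)
    return out

img = np.random.rand(64, 64, 3)
y, cb, cr = visible_channels(img)
print(y.shape, cb.shape)  # (64, 64) (64, 64) -- same size, less detail
```

Comparing a Bayer image and a 3-CCD image after this kind of filtering, rather than pixel for pixel, is one way to make the "visible information" argument concrete.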

With the full-colour sensor there is the full luminance information at
each pixel and there are no aliasing issues caused by different Nyquist
frequencies for chrominance and luminance.


The aliasing or error introduced in a Bayer sensor in real world images
has the effect of lowering the sensor's capability to resolve detail.


Yes, but why do you care if the difference is invisible?

Dave
#185
Old January 23rd 06, 06:24 PM posted to rec.photo.digital

In article , says...

snip

So in practice the AA filter is set up for the pixel pitch, providing
proper anti-aliasing for luminance and allowing some local chroma error.
But, again, if the error is so localized that you can't see it, does
that matter?


You don't just get a localised colour error. You also get an error in
the luminance, because the luminance is estimated from incomplete
data. Besides, a large number of localised errors translates into less
overall resolution.

Think of a scene where the colour changes between adjacent pixels.


Thought experiments that ignore the AA filter, and so are demonstrably
wrong, don't convince me of anything. Do you have (a) a real example of
an image affected by this, or (b) a simulation of the difference between
black/white and colour/white or black/colour cases - either mathematical
or numerical - that shows the effect you suggest?


You just suggested using an AA filter matched to the pixel pitch, which
allows colour changes between neighbouring pixels - or did I
misunderstand you?

snip

But you previously said that the mere presence of colour would reduce
resolution, so the colour-on-white targets should be affected too.
Right?


No, I meant colour-on-colour targets.

snip

By the way, if you desaturate Roger's image, the colour blocks become
homogeneous grey. If you instead convert to grayscale, you will see that
the yellow line is not that much brighter than the blue or green lines.


Huh? Conversion to greyscale should give you the same effect as
complete desaturation. If it doesn't, your image editing tool is
implementing one of those operations incorrectly.


I was also surprised.

I guess what plays a role here is the "distance" between the colours -
yellow, for instance, being the complement of blue.


No, what matters is the luminance. Saturated yellow is very bright,
nearly as bright as full white, while saturated blue is very dark.
Colour bars that differ substantially in luminance remain resolvable
*because the luminance component is resolved* even if the colour is not.
Colour bars that are close in luminance have *only* colour to
distinguish between them, and the colour-only difference is lower in
resolution.


Download Roger's image and convert to greyscale. You'll see that there
is no big brightness difference between the blue, green and yellow bars.
The blue in Roger's image is quite bright.

snip

I'd argue that the important question is "how much higher is the
*visible* information content?". Take both images, convert to a
luma/chroma space like Lab, filter both components using a model of the
eye's MTF at the appropriate viewing distance and print size (thus
removing information the eye cannot see). And *then* compare
information content. For a wide range of normal viewing distances,
there will be *no difference* in visible information content between
Bayer and a 3-CCD sensor, because the small amount of extra
high-frequency information and lower high-frequency chroma errors of the
3CCD sensor are completely invisible to the eye.


There will still be luminance errors, caused by the incomplete RGB data
of the sensor array, which have the effect of reducing the resolution.

snip

The aliasing or error introduced in a Bayer sensor in real world images
has the effect of lowering the sensor's capability to resolve detail.


Yes, but why do you care if the difference is invisible?


I don't think it's invisible.
--

Alfred Molon
------------------------------
Olympus 50X0, 7070, 8080, E300, E500 forum at
http://groups.yahoo.com/group/MyOlympus/
Olympus E500 resource - http://myolympus.org/E500/
 



