A Photography forum. PhotoBanter.com


Digital quality (vs 35mm): Any real answers?



 
 
  #51  
Old July 23rd 04, 12:14 PM
Toralf Lund

Justin Thyme wrote:
"Toralf" wrote in message
...

Hi.

I'm still wondering about how good the image quality of modern digital
cameras (especially SLRs) really is, in particular how it compares with
35mm film. I've seen many articles on the subject on the Net, but few of
them seem to give you a lot of tangible information (I want to see the
numbers, please), and I can't help feeling that tests they refer to are
usually done to prove a point, i.e. that digital cameras are as good as
35mm, which is not the way you do proper research.


Check out http://clarkvision.com/imagedetail/f...digital.1.html
Seems to be not trying to prove a point. In some tests the film wins and in
some the digital wins.

To say a few words about myself, I'm working for a company that makes
high-accuracy, large-format scanners, so I'm not particularly impressed
when I hear e.g. 6 million pixels (you need to talk about *billions* of
pixels if I'm really going to listen), and the word "interpolation"
leaves a bad taste in my mouth. But this also means I know that high
resolution isn't everything, of course; parameters like geometric
precision or signal-to-noise ratio also count a lot.


Actually, they mean SFA. I think too often we forget the purpose of
photography - to make images that look good. Who cares if the S/N ratio is
poor if the image looks good? Some of my favorite digital photos are as
noisy as all heck; in those, the noise added to the photo rather than taking
away from it.

So, essentially what you are saying is that the noise doesn't matter
at all, but it's somehow still contributing to the picture?


The proof isn't in the technical specifications, the proof is in
whether the photo looks good.

You're missing the point. The point of the technical specification is
actually that it tells you something about whether or not a photo will
generally look good - so you won't have to see a high number of them to
find out. Also, for most people a statistically significant number of
photos just won't be available, so all there is to go by is the numbers
and some highly subjective opinions. Of which I've found a lot of the
latter, while the former is obviously preferable.

People will tell you that 2MP is no good
above 8x10 - I have a 16x12 photo made from 2MP on the wall that looks fine.
If you walk right up to it you can see some pixelation, but you can't see it
from a normal viewing distance of about 2 feet. Likewise I have images made
from ISO400 consumer grade film that look great too.

Be that as it may, some of the questions I'd like to have answered are
these:

1. What is the resolution of a 35mm film anyway? I think I read
somewhere that a colour negative is at least 3000dpi. Is that correct?
How about black&white? (Yeah I know, a film doesn't have pixels in
exactly the same sense as a digital image, but it *is* made up of
discrete elements after all.)


The site I posted above indicates that Fuji Velvia is approximately
equivalent to 15MP in its resolving power.

OK. Thanks.

[ snip ]
  #52  
Old July 23rd 04, 01:32 PM
Zebedee


"Paul J Gans" wrote in message
...
Zebedee wrote:

"Sabineellen" wrote in message
...
I decided to settle on 3 megapixels. It's adequate for my needs and as

with
slides, I ensure my photos are perfect before I squeeze the button. I

claim
3 megapixels is the perfect equivalent of 35mm for most purposes. 6mp

just
eats up storage space for no visible advantage.

What specific camera did you settle on? I settled on a modest but good

enough
5mp, 'cos i thought if I go for a high end 5mp then i might as well get

an
8mp,
and if i go for an 8mp then I might as well get a dSLR, and if I go for

a
dSLR
I might as well have one of the better one, so it had to stop at one

point.

BTW, it's a wild claim to say that 3mp is "the perfect equivalent" of

35mm
for
most purposes. To print 8x10 at 300dpi you need 7.2 megapixels. 3mp, or

even
2mp, is good enough if you only need them displayed on a monitor. I

personally
display images on a calibrated 21" monitor and find that I really don't

need
prints. I had a home computer in the early 80s and got a PDA many years

ago so
personally I'm well adapted to the paperless existence. In fact, I

really
dislike writing and I'm quite comfortable with typing. So yes, for my

usage,
5mp, or even 2mp, would be adequate, but i wouldn't call it a "perfect
equivalent" to 35mm.


Just why would anybody print at more than 150dpi when that's the maximum

the
eye can see?


Because dpi is often misleading. For some older printers
each color was printed as a separate dot. So 150 dpi meant
that pure red, for instance, is only at 50 dpi.


I wasn't talking about printer dpi. I was talking about image dpi...

--
Yours

Zebedee

(Claiming asylum in an attempt
to escape paying his debts to
Dougal and Florence)




  #54  
Old July 23rd 04, 03:32 PM
MXP


"Michael Scarpitti" skrev i en meddelelse
om...
"MXP" wrote in message

...
All the tests I have seen where 35mm film is compared to a modern DSLR
(6-11MP)...the DSLR pictures shows more detail and less noise than a

fine
grained film like Provia 100F.


That's not so startling. That film is not as sharp as Kodachrome. It's
a poor choice to compare.

Kodachrome 25 was a nice film, but I don't have any. It was the only
Kodachrome.
I don't think Kodachrome 64 is sharper or more fine-grained than any of the
100F's.
I want to see proof of that.

It is quite frustrating that 6MP can beat 35mm.
I know many scanners can do 4000 dpi, but what if most of the information
is noise?

I still use film and it will be quite interesting to see a test where e.g.
Provia 100F shows more detail than e.g. a D1X/D70 or 1Ds/300D.

When I see my slides projected it seems strange that a 6MP DSLR can do
better....

Max


"Toralf" skrev i en meddelelse
...
Hi.

I'm still wondering about how good the image quality of modern digital
cameras (especially SLRs) really is, in particular how it compares

with
35mm film. I've seen many articles on the subject on the Net, but few

of
them seem to give you a lot of tangible information (I want to see the
numbers, please), and I can't help feeling that tests they refer to

are
usually done to prove a point, i.e. that digital cameras are as good

as
35mm, which is not the way you do proper research.

To say a few words about myself, I'm working for a company that makes
high-accuracy, large-format scanners, so I'm not particularly

impressed
when I hear e.g 6 million pixels (you need to talk about *billions* of
pixels if I'm really going to listen), and the word "interpolation"
leaves a bad taste in my mouth. But this also means I know that high
resolution isn't everything, of course; parameters like geometric
precision or signal-to-noise ratio also count a lot.

Be that as it may, some of the questions I'd like to have answered are
these:

1. What is the resolution of a 35mm film anyway? I think I read
somewhere that a colour negative is at least 3000dpi. Is that

correct?
How about black&white? (Yeah I know, a film doesn't have pixels in
exactly the same sense as a digital image, but it *is* made up of
discrete elements after all.)

2. What about the print? 300dpi?

3. I know that the most common sensors are made up of individual
elements for the read, green and blue channels, arranged in a special
pattern, whose data is somehow interpolated into RGB pixels. But what
exactly does e.g. 6 megapixels mean in that context? Does it mean that
the sensor has (just) 6 million elements, or that data from a higher
number (like 18 or 24 million) is combined into 6 million RGB pixels?

The same question more bluntly put: When Canon/Nikon/Pentax is talking
about 6MP, is that just a big a lie as the one about 10MP on Sigma
cameras? (I'm hoping not, as I think the Sigma/Foveon way of counting
really takes the cake.)

4. Can the inaccuracy associated with the above mentioned

interpolation
be quantified and/or measured against e.g. the error introduced by
scanning a negative with a film-scanner? And how does it compare with
pixel interpolation in the scanning sense?

5. And how about those other parameters I mentioned briefly above -

like
different kinds of geometric distortions, noise, flat field bias etc.?
Can those be compared with the ones of plain old film?

6. And the chromic aberration effects? How serious are they these

days?
And are the full-frame sensors that are actually found in some

high-end
cameras now, in any way comparable to film in that respect?

Well, maybe some people will say I have a somewhat critical or
conservative attitude towards digital cameras, but I actually think

you
ought to be a bit sceptical when something "new and wonderful" comes a
long; new technology is too often introduced for technology's own

sake,
IMO.

- Toralf




  #56  
Old July 23rd 04, 04:12 PM
Marvin Margoshes


"Toralf" wrote in message
...
Hi.

I'm still wondering about how good the image quality of modern digital
cameras (especially SLRs) really is, in particular how it compares with
35mm film. I've seen many articles on the subject on the Net, but few of
them seem to give you a lot of tangible information (I want to see the
numbers, please), and I can't help feeling that tests they refer to are
usually done to prove a point, i.e. that digital cameras are as good as
35mm, which is not the way you do proper research.

To say a few words about myself, I'm working for a company that makes
high-accuracy, large-format scanners, so I'm not particularly impressed
when I hear e.g. 6 million pixels (you need to talk about *billions* of
pixels if I'm really going to listen), and the word "interpolation"
leaves a bad taste in my mouth. But this also means I know that high
resolution isn't everything, of course; parameters like geometric
precision or signal-to-noise ratio also count a lot.

Be that as it may, some of the questions I'd like to have answered are
these:

1. What is the resolution of a 35mm film anyway? I think I read
somewhere that a colour negative is at least 3000dpi. Is that correct?
How about black&white? (Yeah I know, a film doesn't have pixels in
exactly the same sense as a digital image, but it *is* made up of
discrete elements after all.)


When Kodak brought out their 13 Mp camera, they said it equals the
resolution of 35 mm film. Halation in the emulsion limits the resolution of
film.


2. What about the print? 300dpi?


Unless you have exceptional paper, somewhere around 250 dpi gives the
maximum resolution. The limitation here is the spreading of the ink
droplets.


3. I know that the most common sensors are made up of individual
elements for the red, green and blue channels, arranged in a special
pattern, whose data is somehow interpolated into RGB pixels. But what
exactly does e.g. 6 megapixels mean in that context? Does it mean that
the sensor has (just) 6 million elements, or that data from a higher
number (like 18 or 24 million) is combined into 6 million RGB pixels?


The color data from 6 million sensor elements is converted to 6 Mp of RGB
pixels. The math is tricky, and there are many misunderstandings. To
convince yourself, it would be best to take photos of a resolution test
chart.


The same question more bluntly put: When Canon/Nikon/Pentax is talking
about 6MP, is that just as big a lie as the one about 10MP on Sigma
cameras? (I'm hoping not, as I think the Sigma/Foveon way of counting
really takes the cake.)


Hard data on this is hard to come by. I haven't seen any - just arm-waving
arguments.


4. Can the inaccuracy associated with the above mentioned interpolation
be quantified and/or measured against e.g. the error introduced by
scanning a negative with a film-scanner? And how does it compare with
pixel interpolation in the scanning sense?


Again, hard data will trump theory. But the comparisons must be done
carefully to avoid unintended artifacts.

5. And how about those other parameters I mentioned briefly above - like
different kinds of geometric distortions, noise, flat field bias etc.?
Can those be compared with the ones of plain old film?


I think they are better than film. Film is subject to distortion in the
camera and in processing. Anyway, software can correct most distortions.


6. And the chromatic aberration effects? How serious are they these days?
And are the full-frame sensors that are actually found in some high-end
cameras now, in any way comparable to film in that respect?


It seems to depend on the particular camera. When I had an Olympus 3040, I
saw many complaints about "purple fringing", often misidentified as
chromatic aberration - which is a lens problem. I looked for the fringing
in some of my photos with sharp, high-contrast edges, where the effect is
supposed to be present. It wasn't there.


Well, maybe some people will say I have a somewhat critical or
conservative attitude towards digital cameras, but I actually think you
ought to be a bit sceptical when something "new and wonderful" comes
along; new technology is too often introduced for technology's own sake,
IMO.

Digital imaging isn't new anymore. It is routinely used in some of the most
demanding photographic applications, like astronomy.


- Toralf



  #57  
Old July 23rd 04, 06:11 PM
William Graham


"Toralf Lund" wrote in message
...
Stephen H. Westin wrote:
Toralf writes:

[ snip ]



Be that as it may, some of the questions I'd like to have answered are
these:

1. What is the resolution of a 35mm film anyway?



It's hard to say, as the resolution limit is different from that of a
digital sensor. Rather than a hard limit, you get less information and
more blur and noise as you increase resolution in scanning a piece of
film.


I think I read
somewhere that a colour negative is at least 3000dpi. Is that
correct?



It's in the ballpark. [ ... ]

OK. Thanks


That means a 24 x 36 mm sensing plane would need about 12 megapixels to have
the same resolution as film. Digital cameras are not too far from that
now.... Perhaps in another couple of years...
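The 12-megapixel figure above follows directly from the roughly 3000dpi scan estimate mentioned earlier in the thread. A quick check (Python, purely illustrative):

```python
MM_PER_INCH = 25.4

def film_equivalent_mp(scan_dpi, frame_w_mm=36.0, frame_h_mm=24.0):
    """Megapixels in a full 24 x 36 mm frame sampled at scan_dpi."""
    w_px = frame_w_mm / MM_PER_INCH * scan_dpi
    h_px = frame_h_mm / MM_PER_INCH * scan_dpi
    return w_px * h_px / 1e6

# ~3000dpi over a full 35mm frame works out to about 12 MP
print(round(film_equivalent_mp(3000), 1))   # 12.1
# a 4000dpi film scan yields about 21.4 MP (much of it possibly grain noise)
print(round(film_equivalent_mp(4000), 1))   # 21.4
```

The same arithmetic at 4000dpi shows why MXP's question about scanner noise matters: the extra pixels exist, but whether they carry real image detail is a separate issue.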


  #58  
Old July 23rd 04, 06:27 PM
Stephen H. Westin

"William Graham" writes:

snip

That means a 24 x 36 mm sensing plane would need about 12 megapixels to have
the same resolution as film. Digital cameras are not too far from that
now....Perhaps in another couple of years..........


Huh? The Kodak almost-14MP DCS Pro 14n shipped over a year ago. And
the DCS Pro SLR/n has replaced it, using an improved sensor. It uses
Nikon-mount lenses, and there is a Canon-mount sibling, the
SLR/c. Several people are using these in lieu of medium-format film
equipment, as they feel the image quality is better.

And medium-format backs reached 16MP some years ago; the best current
single-shot backs have 22MP.

--
-Stephen H. Westin
Any information or opinions in this message are mine: they do not
represent the position of Cornell University or any of its sponsors.
  #59  
Old July 23rd 04, 06:47 PM
scott



"Zebedee" wrote in message

"Sabineellen" wrote in message
...
I decided to settle on 3 megapixels. It's adequate for my needs
and as with slides, I ensure my photos are perfect before I
squeeze the button. I claim 3 megapixels is the perfect
equivalent of 35mm for most purposes. 6mp just eats up storage
space for no visible advantage.


What specific camera did you settle on? I settled on a modest but
good enough 5mp, 'cos i thought if I go for a high end 5mp then i
might as well get an 8mp, and if i go for an 8mp then I might as
well get a dSLR, and if I go for a dSLR I might as well have one
of the better one, so it had to stop at one point.

BTW, it's a wild claim to say that 3mp is "the perfect equivalent"
of 35mm for most purposes. To print 8x10 at 300dpi you need 7.2
megapixels. 3mp, or even 2mp, is good enough if you only need them
displayed on a monitor. I personally display images on a
calibrated 21" monitor and find that I really don't need prints. I
had a home computer in the early 80s and got a PDA many years ago
so personally I'm well adapted to the paperless existence. In
fact, I really dislike writing and I'm quite comfortable with
typing. So yes, for my usage, 5mp, or even 2mp, would be adequate,
but i wouldn't call it a "perfect equivalent" to 35mm.


Just why would anybody print at more than 150dpi when that's the
maximum the eye can see?


Rubbish! It depends how closely you look at the image. My phone has a
180dpi display, and it is pretty easy to see the individual pixels. A
250dpi display looks *much* smoother. I guess the same goes for prints.
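The pixel arithmetic behind the dpi claims traded in this subthread is easy to verify. A minimal sketch (Python, purely illustrative):

```python
def megapixels_for_print(width_in, height_in, dpi):
    """Pixels needed to print width_in x height_in inches at dpi dots per inch."""
    return (width_in * dpi) * (height_in * dpi) / 1e6

# 8x10" at 300dpi needs 7.2 MP, as Sabineellen said
print(megapixels_for_print(8, 10, 300))   # 7.2
# at the claimed 150dpi eye limit, the same print needs only 1.8 MP
print(megapixels_for_print(8, 10, 150))   # 1.8
# conversely, 2 MP spread over a 16x12" print gives only ~102 dpi,
# which is why pixelation shows up close but not at viewing distance
```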


  #60  
Old July 23rd 04, 07:21 PM
Toralf

[ ... ]


3. I know that the most common sensors are made up of individual
elements for the red, green and blue channels, arranged in a special
pattern, whose data is somehow interpolated into RGB pixels. But what
exactly does e.g. 6 megapixels mean in that context? Does it mean that
the sensor has (just) 6 million elements, or that data from a higher
number (like 18 or 24 million) is combined into 6 million RGB pixels?

The same question more bluntly put: When Canon/Nikon/Pentax is talking
about 6MP, is that just as big a lie as the one about 10MP on Sigma
cameras? (I'm hoping not, as I think the Sigma/Foveon way of counting
really takes the cake.)

OK. I've been thinking a bit about the "luminance" argument that has
popped up a number of times (but not a lot, so I may have overlooked
something), and I'm not sure I'm convinced - although it depends a bit
on how you see it.

It seems to me that one of the key weaknesses of the standard "pattern"
sensor is actually that it cannot capture luminance. When you're looking
through a filter, there is simply no way you can tell white light apart
from light of the colour associated with that filter! Furthermore, the
real problem seems not to be photographing a large area in one of the
primary colours like someone suggested, but rather to capture a tiny speck
of a distinct colour - especially if that colour is such that one or two
of the components are different from the surrounding data, and the
other(s) the same. For instance, imagine taking a picture of a small
red dot against a completely white background. Now, assume that the
size of that dot's projection onto the sensor matrix is one pixel, and
that it falls on a red pixel sensor. Surely, this will mean that the dot
won't show up at all on the picture - as there is no way the red sensor
can distinguish it from all the white points surrounding it, and the
adjacent green and blue sensors won't "see" it at all. If, on the other
hand, it falls on e.g. a green sensor, it should indeed show up, as that
sensor won't register any light at all, but there is no way of telling
its exact colour - all you know is that it's not green or a colour that
has a green component. (But it may be black or any shade of blue, red
or purple.)

For an exact representation of a dot to be guaranteed, the dot would
have to be of a size equivalent to at least 4 sensor elements (when the
matrix is made up of sets of one red, one blue and two green) - or more
than 3, anyway. By this argument, you could say that the real resolution
is actually the equivalent of 1/4 of the number of pixel sensors, or
1.5MP for a "6MP sensor".
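The red-dot thought experiment above can be sketched in a few lines. This is a toy model (Python, with made-up unit intensities; it is not real sensor or demosaicing code):

```python
# One repeating cell of the usual RGGB Bayer mosaic described above.
BAYER_2X2 = [["R", "G"],
             ["G", "B"]]

def site_reading(channel, rgb):
    """What one filtered sensor element records for light of colour rgb."""
    return rgb[{"R": 0, "G": 1, "B": 2}[channel]]

WHITE = (1.0, 1.0, 1.0)
RED = (1.0, 0.0, 0.0)

for row in BAYER_2X2:
    for ch in row:
        # A one-pixel red dot is detectable only if its reading differs
        # from the white background reading at that site.
        detectable = site_reading(ch, RED) != site_reading(ch, WHITE)
        print(f"{ch} site: dot detectable = {detectable}")
# An R site reads 1.0 for both white and pure red, so the dot vanishes.
# A G or B site reads 0.0 for the dot, so *something* is seen, but that
# single dark sample cannot say whether the dot was black, blue, red or
# purple - exactly the ambiguity argued above.
```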

However, I admit that if the projection "touches" two elements instead
of just one - and it has to be larger than one pixel to do that - the
chance of getting the value right is much greater, and if it touches at
least three, chances are that there is at least one of each colour
channel, in which case a correct colour should be guaranteed. In other
words, it's fair to assume that a size of "two point something" pixels
is sufficient for a correct representation of our little speck.

So, I'd say that for purposes of quality comparisons, a 6MP array has
neither 6 million pixels nor the very conservative 1.5, but something in
between. It's actually tempting to guess that the "effective" number of
pixels is somewhere near the 3M the Foveon has if you count the way
they'll teach you at most schools except the Sigma marketing academy.
(Which would mean that the sensors are equally good, but that Sigma are
the greatest liars...)

Furthermore, the luminance phenomenon is indeed useful in one sense,
namely that it allows for a good interpolation algorithm - and it does
so *because it isn't registered*. It didn't have to be the luminance, of
course, but I think an essential requirement for reasonable
interpolation is being able to make good assumptions about some
parameter that isn't actually captured.

So, if the 6MP data is generated based on the assumption that the
luminance always makes up most of the light, and that turns out to be
true in a certain case, then maybe you can say that you *really* have
(close to) 6MP *in that particular situation.* Also, the general quality
of data produced by the camera is to a large extent determined by the
probability that the assumption will hold.

By the same token, you may make some kind of assumption about the data
from a Foveon sensor or a colour scan that enables you to interpolate
extra pixels. For instance, you can make the assumption that each new
pixel has a value "right between" the adjacent pixels, and get a type of
linear interpolation algorithm. That way you can at least double the
number of pixels and get decent results in most cases, I think.
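That "value right between the adjacent pixels" idea is ordinary linear interpolation; a minimal one-dimensional sketch (Python, illustrative only):

```python
def upsample_2x(row):
    """Roughly double a 1-D row of samples by inserting midpoint values."""
    out = []
    for a, b in zip(row, row[1:]):
        out.append(a)
        out.append((a + b) / 2)  # the invented "right between" sample
    out.append(row[-1])
    return out

print(upsample_2x([0.0, 1.0, 0.0]))  # [0.0, 0.5, 1.0, 0.5, 0.0]
```

The inserted samples carry no new information, of course; the quality of the result rests entirely on how well the "smooth between neighbours" assumption holds, which is the point being made here.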

Based on that, the comparison between the Foveon and the other sensors
really boils down to comparing the quality of the interpolation for the
6MP data with the (potential) quality of the interpolation necessary to
get the same number of pixels based on the Foveon output - or if you
like, the merits of the assumptions that lie behind the different
interpolations.

Cheers,


T
 







Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 PhotoBanter.com.
The comments are property of their posters.