  #8  
Old May 31st 09, 12:10 PM posted to rec.photo.digital,uk.rec.photo.misc
Floyd L. Davidson
 
Posts: 5,138
Default Could you actually see photos made from RAW files?

Eric Stevens wrote:
On Sat, 30 May 2009 21:32:27 -0800, Floyd L. Davidson wrote:
Eric Stevens wrote:


Floyd has failed to point out that at this point he has omitted a
great deal of previous text. I know he has been around more than long
enough to know that this is not the right thing to do.


I've been posting on Usenet for over 20 years Eric, and
I do know that snipping out the part not being commented
on is the right way to edit an article before posting.

When are *you* going to learn that?

[...] Subject to statistical error limitations, there is a one to one
correspondence between the source image and the RAW file. One can be
converted to the other using the rules inherent in the camera's
software.


What do you mean by "the source image"?


That which is projected onto the sensor by the lens.


Then your statements above are patently silly on their face.

The raw data is a sampled set drawn from the projected image.
The projected image cannot ever be recreated in its entirety,
nor can the samples even be recreated with precision.
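The point can be illustrated with a toy sketch (pure Python, hypothetical values; real sensors sample a two-dimensional image, but the principle is identical): once a continuous signal has been sampled and quantized, distinct originals can produce the very same raw values, so no converter can get the projected image back.

```python
# Toy sketch: sampling and quantizing a "projected image" (here a 1-D
# intensity function) discards information that no converter can recover.
import math

def projected_intensity(x):
    # Stand-in for the continuous image the lens projects (hypothetical).
    return 0.5 + 0.5 * math.sin(2 * math.pi * x)

# Sample at 8 positions and quantize to 4 bits, as a sensor's ADC would.
samples = [round(projected_intensity(i / 8) * 15) for i in range(8)]

# A different continuous signal that happens to agree at the sample points
# yields identical raw values, so the original cannot be recreated
# "in its entirety" from the samples alone.
def other_signal(x):
    return projected_intensity(x) + 0.01 * math.sin(16 * math.pi * x)

other = [round(other_signal(i / 8) * 15) for i in range(8)]
print(samples == other)  # the two sampled sets coincide
```

Two different "source images", one raw record: the mapping from scene to raw data is many-to-one, not one-to-one.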

Perhaps you need to explain what you are trying to
say, preferably without the aid of whatever it is you have snipped
from this article before replying.


I certainly have not used text that isn't here, and
don't see how that would be possible!

Do you know, for example, why the word "interpolation"
(see a good dictionary) is used to describe the process
of converting raw sensor data to an image format?


I know very well what is meant by the word 'interpolation' and the
manner in which this is carried out has nothing whatsoever to do with
the relationship between the source image and the RAW file except to
the extent that it is determined by the rules inherent in the camera's
software.


You said the "source image" is what is projected on the
sensor. Interpolation of course is the method by which
the sensor data is converted to an image format for
viewing.

Two very distinctly separate relationships.

The data in the RAW file can't be restructured to make a
different image without changing the data.


That is not true. In fact there is far more data in a
raw file than is needed to make an image. Likewise it
is possible to emphasize different parts of the data in
different ways to get different images. That is exactly
what white balance is, for example.


And a change in the white balance involves a change in the interpreted
RAW data: i.e. the white balance can only be changed by changing the
raw data.


It is quite possible to change the raw data to effect
white balance, but it isn't normally done that way (Nikon,
for example, has hinted that they might be doing exactly
that in hardware).

What is changed is the interpolation of the data when
creating an image format. The raw data is not changed,
and the raw file stays exactly the same. The way the
data is manipulated during interpolation changes.

Regardless of that, it is rather easy to demonstrate
that the raw data is not changed in order to adjust
white balance. Merely convert a RAW file to a JPEG
image and then, using only the JPEG image (which
contains vastly less data than the RAW file), use an
editor to change the white balance and write a new JPEG
file with a different white balance. The raw data is
not even used, much less changed.
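That demonstration can be sketched in a few lines (hypothetical pixel values, pure Python; a real editor works on a full decoded JPEG, but the operation is the same): white balance is just per-channel scaling of already-interpolated RGB data, and the raw values never enter into it.

```python
# Sketch: changing white balance on already-converted RGB data.
# The "raw" values are never consulted, let alone modified.
raw_data = (812, 405, 990)                    # hypothetical 12-bit sensor readings
rgb_image = [(180, 120, 90), (60, 200, 150)]  # pixels from an earlier conversion

def apply_white_balance(pixels, r_gain, g_gain, b_gain):
    # Per-channel gains; clip to the 8-bit range a JPEG would use.
    return [(min(255, round(r * r_gain)),
             min(255, round(g * g_gain)),
             min(255, round(b * b_gain))) for r, g, b in pixels]

warmer = apply_white_balance(rgb_image, 1.2, 1.0, 0.8)
print(warmer)    # a new image with a warmer white balance
print(raw_data)  # untouched
```

The same gains applied at interpolation time are how a raw converter adjusts white balance without ever rewriting the raw file.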

It's also true that while I said that at least 9 sensor
locations are used to generate each pixel, there are
several variations on ways to interpolate the data that
use more than 9. Each method is different.
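The nine-location point can be sketched for the simplest case, bilinear demosaicing of a Bayer mosaic (toy values; real converters handle edges and often use larger kernels): one output RGB pixel is interpolated from a 3x3 block of single-color sensor sites.

```python
# Sketch of bilinear demosaicing: one RGB pixel interpolated from a
# 3x3 block (nine sensor locations) of a Bayer mosaic, centred here
# on a blue site. Pattern of this block:
#   R G R
#   G B G
#   R G R
mosaic = [
    [200,  90, 210],
    [ 95,  60, 100],
    [190,  85, 205],
]

def demosaic_blue_site(m):
    # At a blue site: blue is measured directly, green comes from the
    # four edge neighbours, red from the four corner neighbours --
    # nine sensor locations in all for one output pixel.
    blue  = m[1][1]
    green = (m[0][1] + m[1][0] + m[1][2] + m[2][1]) / 4
    red   = (m[0][0] + m[0][2] + m[2][0] + m[2][2]) / 4
    return (red, green, blue)

print(demosaic_blue_site(mosaic))  # -> (201.25, 92.5, 60)
```

Every interpolation method weights and combines those neighbours differently, which is why different raw converters produce visibly different images from identical raw data.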


The sensor locations are irrelevant. The RAW data is derived from the
sensors by rules which are determined by the manufacturer of the
camera.


False. Every different raw converter design uses a
different set of "rules". Coffin's dcraw.c uses one
set, Nikon uses another, and several other raw
converters are different from both of those.

The sensor locations are hardly irrelevant either. As I
said, at least *nine* of them are used to generate each
pixel in the resulting image, and you can be assured the
location is relevant! It isn't one pixel and then 8
other randomly chosen locations... it's a group of 9
(or more).

The signals generated by the sensors are determined by the
rules inherent in the camera's software.


They are determined by rules inherent in the camera's
hardware. The sensor is not manipulated by software
other than clearing it and reading it. A given amount
of light on one sensor location produces *exactly* the
same output from the sensor regardless of the camera's
software.

As I have already said, there
is a one to one correspondence between the source image and the RAW
file.


You can say that all you like, but it still requires at
least *nine* different sensor locations to generate data
for each pixel of the resulting image. It is not a one
to one relationship.

You don't have a choice of RAW files for a given image. Nor do
you have a choice of images for a given RAW file.


But you have a choice of an infinite number of resulting
images when the camera raw data is interpolated. None
of them are exactly the same as your "source image" that
was projected onto the sensor.

If you understood that, then the above part that you
objected to should have made sense. If not, tell me
what you think it all is and I'll try to explain the
significance in a way that is directed at your response
(rather than at the OP for this thread).


JPEGs, TIFFs and RAW files all contain data required to create an
image.


A JPEG or TIFF contains the data for a single image. A
RAW file contains information for something approaching
an infinite number of images.

The principal difference is that the format of JPEGs and TIFFs
are independent of the camera which created them while RAW files
are not.


The JPEG and TIFF images are no more, or less,
independent of the camera than the RAW data.

--
Floyd L. Davidson http://www.apaflo.com/floyd_davidson
Ukpeagvik (Barrow, Alaska)