June 1st 09, 05:28 AM, posted to rec.photo.digital,uk.rec.photo.misc
Eric Stevens
Could you actually see photos made from RAW files?

On Sun, 31 May 2009 18:58:46 -0800, (Floyd L.
Davidson) wrote:

Eric Stevens wrote:
On Sun, 31 May 2009 16:11:00 -0800,
(Floyd L.
Davidson) set out to change the substance of the discussion by
massively editing the article to which he is responding. He is also
trying to switch the argument from the relationship of the original
image to the RAW file to the relationship of the original image to the
raw data (which is quite different from the content of the RAW file).


You actually are that dense!


I read the words. I even quoted them down below. In fact, because you
have (surreptitiously) snipped an awful lot of what follows, it's only
two paragraphs down.

Exactly. So why are you claiming otherwise? The raw
data set is not changed. But there are multiple,
correct, different sets of rules used to generate an
exact image from the raw data.


I've been talking about the RAW file from the beginning. So too were
you at that time. Remember "Floyd, I suspect you have been smoking
something which is not good for you. Subject to statistical error
limitations, ... The data in the RAW file can't be restructured to
make a different image without changing the data."


The data in the raw file is not restructured.

Your nonsense is still nonsense.


How can you relate a different image in the camera to the one RAW
file?

The sensor locations are hardly irrelevant either. As I
said, at least *nine* of them are used to generate each
pixel in the resulting image, and you can be assured the
location is relevant! It isn't one pixel and then 8
other randomly chosen locations... it's a group of 9
(or more).

So?

So please cease this silliness where you claim the
sensor locations are irrelevant.


For any one camera, it's just one of the items which go into the
transformation of the image to the RAW file. The sensor locations are
invariant as is the other camera hardware, and the firmware for that
matter.


So what are you trying to say? The sensor locations *are*
relevant!


Only as part of the transformation algorithm.

You do understand that "firmware" is where the
"software" is, right?

I'm getting the idea that you have a list of buzz words, but no
idea what any of it means.


One of us doesn't seem to.

The signals generated by the sensors are determined by the
rules inherent in the camera's software.


That statement is blatantly false.


You made it false by chopping out the text around it which made clear
what I was talking about. For the benefit of others I had written:

"The sensor locations are irrelevant. The RAW data is derived
from the sensors by rules which are determined by the
manufacturer of the camera. The signals generated by the
sensors are determined by the rules inherent in the camera's
software. As I have already said, there is a one to one
correspondence between the source image and the RAW
file. You don't have a choice of RAW files for a given image.
Nor do you have a choice of images for a given RAW file".

They are determined by rules inherent in the camera's
hardware. The sensor is not manipulated by software
other than clearing it and reading it. A given amount
of light on one sensor location produces *exactly* the
same output from the sensor regardless of the camera's
software.

I should have said "The signals generated by the sensors are
-interpreted- by the rules inherent in the camera's software". To that
extent they are 'determined'.


This statement is blatantly *different*.

I quoted you exactly above. Now you want to change what
you said.


It's called clarification. I haven't changed the meaning.


Then you don't understand what you said.


I thought that was your problem. That's why I clarified it.

Regardless, you are still wrong. The signals from the
sensor are interpreted according to *hardware* and the
resulting data set is written to a RAW file format.
That is what is "interpreted" by software.


Have I ever said otherwise? But so what?


You did say otherwise. Quoted above.


I can't see where. Someone must have accidentally deleted it.

So you now admit that it is not software at all, but a
hard wired hardware transform.


I don't see why you should suddenly try to make that point. In any
case, while I don't know about your camera, I can download an update
for mine. That doesn't sound as though it is all hardware to me.


You aren't going to download an update that changes how the
hardware processes sensor data to generate the "raw data". It's
hard wired. The sensor output is *analog*, and the digital
data is generated by a series of *hardware* devices. About all
the software does is switch the hardware on and off!
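
(An aside for anyone following along: here is a toy model of the
read-out chain being described above. The linear response, 1.0 V full
scale and 12-bit depth are illustrative assumptions only, not any
particular camera's specification.)

# Toy model of the analog-to-digital step in a camera's read-out chain.
# Assumed figures (not any real camera's specs): linear sensel response,
# 1.0 V full-scale analog output, 12-bit ADC.

FULL_SCALE_VOLTS = 1.0
ADC_BITS = 12
MAX_CODE = (1 << ADC_BITS) - 1   # 4095

def adc(analog_volts):
    """Hard-wired conversion: analog sensel voltage -> integer raw value.
    The software never touches the analog value; it only triggers the
    read-out and collects the resulting integer codes."""
    clipped = min(max(analog_volts, 0.0), FULL_SCALE_VOLTS)
    return round(clipped / FULL_SCALE_VOLTS * MAX_CODE)

# Two slightly different analog levels can land on the same code,
# which is the "quantization distortion" that comes up further down:
print(adc(0.50000), adc(0.50007))   # both print 2048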


First, we are not talking about the RAW data. We (should) always have
been talking about the RAW data file. Second, in the case of the Nikon
D300 the update has changed the way in which the raw data from the
sensors has been interpreted and saved to the RAW file.

I assumed a certain level of technical competence from your
background.


That is correct. A few decades working with digital data,
transmission systems (which is what the hardware between the
sensor and the CF card is) and little things like that... :-)


I thought you were a psychologist.

I wouldn't argue with you over what you have just said, but I would
like to point out that this discussion has been about the interpolated
data saved in the RAW file.


The raw data saved in the RAW file is not interpolated.


See the last line of ...
http://en.wikipedia.org/wiki/Bayer_filter

"Bryce Bayer's patent called the green photosensors
luminance-sensitive elements and the red and blue ones
chrominance-sensitive elements. He used twice as many
green elements as red or blue to mimic the human eye's
greater resolving power with green light. These elements
are referred to as sensor elements, sensels, pixel sensors,
or simply pixels; sample values sensed by them, after
interpolation, become image pixels."

The raw data does not define one specific image. When
the data is interpolated there is then an image!
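
(Another illustrative aside: a naive bilinear demosaic of an RGGB
mosaic, roughly the interpolation step the Wikipedia passage refers
to. Real converters use far more elaborate rules; this sketch just
shows that a 3x3-or-larger neighbourhood of sensel values feeds every
image pixel.)

# Naive bilinear demosaic of a tiny Bayer (RGGB) mosaic -- an
# illustration only, not any manufacturer's algorithm.  Each output
# pixel's missing colours are averaged from same-colour neighbours.

def colour_at(y, x):
    """Colour of the sensel at (y, x) in an RGGB pattern."""
    if y % 2 == 0:
        return 'R' if x % 2 == 0 else 'G'
    return 'G' if x % 2 == 0 else 'B'

def demosaic(mosaic):
    h, w = len(mosaic), len(mosaic[0])
    image = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            samples = {'R': [], 'G': [], 'B': []}
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        samples[colour_at(ny, nx)].append(mosaic[ny][nx])
            image[y][x] = tuple(sum(s) / len(s) for s in
                                (samples['R'], samples['G'], samples['B']))
    return image   # one (R, G, B) triple per sensel location

raw = [[ 10, 200,  12, 210],     # made-up sensel values, RGGB layout
       [190,  50, 180,  55],
       [ 11, 205,  13, 215],
       [195,  52, 185,  58]]
print(demosaic(raw)[1][1])       # interpolated RGB pixel at location (1, 1)

(Swap the averaging rule for a different one, say nearest same-colour
neighbour, and the very same raw values yield a different image, which
is the "multiple sets of rules" point made earlier in the thread.)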


And when that data is interpolated and saved in a RAW file then there


That does not happen. (Except of course for the various
thumbnail JPEG images that are embedded in most RAW files.)


Umm...

is only the one image. Run that RAW file backwards through the same
transformation and you end up back with the original image, of which
there can only be the one.


It is a one way process and it cannot be precisely
reversed. If for no other reason than what is called
quantization distortion...


Don't come the technical heavy with me! After all, you are the one who
claimed to not understand what I meant by "statistical error
limitations".

However, in addition to that
the JPEG image which results from interpolation simply
does not contain anything like the full amount of
information that was in the RAW file's data. You cannot
reverse the process.
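
(One more sketch, to put a number on the information-loss point: a
12-bit linear raw value squeezed into an 8-bit output value, ignoring
demosaicing, white balance and everything JPEG compression itself
throws away. The 12-bit depth is an assumption for illustration.)

# Why the raw -> JPEG direction cannot be reversed exactly: many
# distinct 12-bit raw values collapse onto the same 8-bit output value.

def to_8bit(raw12):
    """Map a 12-bit value (0..4095) onto an 8-bit value (0..255)."""
    return round(raw12 / 4095 * 255)

collisions = {}
for raw12 in range(4096):
    collisions.setdefault(to_8bit(raw12), []).append(raw12)

# Each 8-bit value stands in for roughly 16 different raw values, so
# the original raw data cannot be recovered from the 8-bit output.
print(len(collisions[128]))      # roughly 16
print(collisions[128][:4])       # a few of the raw values that map to 128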


HOW DO I MAKE IT CLEAR THAT FROM THE BEGINNING WE HAVE BEEN TALKING
ABOUT THE RELATIONSHIP BETWEEN THE ORIGINAL IMAGE ON THE SENSOR AND
THE 'RAW' FILE?

Sorry for shouting, but I've said the above several times, I've quoted
from the original articles, and you still keep trying to switch to
conversion from RAW to JPG. That's an entirely different question.



Eric Stevens