#12
Could you actually see photos made from RAW files?
Eric Stevens wrote:
On Sun, 31 May 2009 16:11:00 -0800, (Floyd L. Davidson) set out to change the substance of the discussion by massively editing the article to which he is responding. He is also trying to switch the argument from the relationship of the original image to the RAW file to the relationship of the original image to the raw data (which is quite different from the content of the RAW file). You actually are that dense! Exactly. So why are you claiming otherwise? The raw data set is not changed. But there are multiple, correct, different sets of rules used to generate an exact image from the raw data. I've been talking about the RAW file from the beginning. So too were you at that time. Remember "Floyd, I suspect you have been smoking something which is not good for you. Subject to statistical error limitations, ... The data in the RAW file can't be restructured to make a different image without changing the data." The data in the raw file is not restructured. Your nonsense is still nonsense. The sensor locations are hardly irrelevant either. As I said, at least *nine* of them are used to generate each pixel in the resulting image, and you can be assured the location is relevant! It isn't one pixel and then 8 other randomly chosen locations... it's a group of 9 (or more). So? So please cease this silliness where you claim the sensor locations are irrelevant. For any one camera, it's just one of the items which go to the transformation of the image to the RAW file. The sensor locations are invariant as is the other camera hardware, and the firmware for that matter. So what are you trying to say? The sensor locations *are* relevant! You do understand that "firmware" is where the "software" is, right? I'm getting the idea that you have a list of buzz words; but no idea what any of it means. The signals generated by the sensors are determined by the rules inherent in the camera's software. That statement is blatantly false. 
They are determined by rules inherent in the camera's hardware. The sensor is not manipulated by software other than clearing it and reading it. A given amount of light on one sensor location produces *exactly* the same output from the sensor regardless of the camera's software. I should have said "The signals generated by the sensors are -interpreted- by the rules inherent in the camera's software". To that extent they are 'determined'. This statement is blatantly *different*. I quoted you exactly above. Now you want to change what you said. It's called clarification. I haven't changed the meaning. Then you don't understand what you said. Regardless, you are still wrong. The signals from the sensor are interpreted according to *hardware* and the resulting data set is written to a RAW file format. That is what is "interpreted" by software. Have I ever said otherwise? But so what? You did say otherwise. Quoted above. So you now admit that it is not software at all, but a hard wired hardware transform. I don't see why you should suddenly try to make that point. In any case, while I don't know about your camera, I can download an update for mine. That doesn't sound as though it is all hardware to me. You aren't going to download an update that changes how the hardware processes sensor data to generate the "raw data". It's hard wired. The sensor output is *analog*, and the digital data is generated by a series of *hardware* devices. About all the software does is switch the hardware on and off! I assumed a certain level of technical competence from your background. That is correct. A few decades working with digital data, transmission systems (which is what the hardware between the sensor and the CF card is) and little things like that... :-) I wouldn't argue with you over what you have just said but I would like to point out that this discussion has been about the interpolated data saved in the RAW file. The raw data saved in the RAW file is not interpolated. 
The raw data does not define one specific image. When the data is interpolated there is then an image! And when that data is interpolated and saved in a RAW file then there That does not happen. (Except of course for the various thumbnail JPEG images that are embedded in most RAW files.) is only the one image. Run that RAW file backwards through the same transformation and you end back up with the original image of which there can only be the one. It is a one way process and it cannot be precisely reversed. If for no other reason than what is called quantization distortion... However, in addition to that the JPEG image which results from interpolation simply does not contain anything like the full amount of information that was in the RAW file's data. You cannot reverse the process. -- Floyd L. Davidson http://www.apaflo.com/floyd_davidson Ukpeagvik (Barrow, Alaska) |
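The quantization point argued above can be sketched in a few lines. This is a hypothetical illustration (the 12-bit depth and the synthetic signal are assumptions, not any particular camera's pipeline): once analog sensor levels are rounded to discrete A/D codes, the exact original values cannot be recovered.

```python
import numpy as np

# Hypothetical sketch: quantization by an A/D converter is one-way.
# The bit depth and the signal below are made-up illustration values.
rng = np.random.default_rng(0)
analog = rng.uniform(0.0, 1.0, size=1000)   # simulated analog photosite levels

levels = 2 ** 12                            # assume a 12-bit converter
codes = np.round(analog * (levels - 1))     # the digital "raw" samples

recovered = codes / (levels - 1)            # best possible inversion
error = np.abs(analog - recovered)          # residual quantization error
```

Every sample lands within half a code step of its original level, but almost none land exactly on it, so the analog signal cannot be reconstructed bit-for-bit from the quantized samples.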
#13
Could you actually see photos made from RAW files?
On Sun, 31 May 2009 18:58:46 -0800, (Floyd L.
Davidson) wrote: Eric Stevens wrote: On Sun, 31 May 2009 16:11:00 -0800, (Floyd L. Davidson) set out to change the substance of the discussion by massively editing the article to which he is responding. He is also trying to switch the argument from the relationship of the original image to the RAW file to the relationship of the original image to the raw data (which is quite different from the content of the RAW file). You actually are that dense! I read the words. I even quoted them down below. In fact, because you have (surreptitiously) snipped an awful lot of what follows, it's only two paragraphs down. Exactly. So why are you claiming otherwise? The raw data set is not changed. But there are multiple, correct, different sets of rules used to generate an exact image from the raw data. I've been talking about the RAW file from the beginning. So too were you at that time. Remember "Floyd, I suspect you have been smoking something which is not good for you. Subject to statistical error limitations, ... The data in the RAW file can't be restructured to make a different image without changing the data." The data in the raw file is not restructured. Your nonsense is still nonsense. How can you relate a different image in the camera to the one RAW file? The sensor locations are hardly irrelevant either. As I said, at least *nine* of them are used to generate each pixel in the resulting image, and you can be assured the location is relevant! It isn't one pixel and then 8 other randomly chosen locations... it's a group of 9 (or more). So? So please cease this silliness where you claim the sensor locations are irrelevant. For any one camera, it's just one of the items which go to the transformation of the image to the RAW file. The sensor locations are invariant as is the other camera hardware, and the firmware for that matter. So what are you trying to say? The sensor locations *are* relevant! Only as part of the transformation algorithm. 
You do understand that "firmware" is where the "software" is, right? I'm getting the idea that you have a list of buzz words; but no idea what any of it means. One of us doesn't seem to. The signals generated by the sensors are determined by the rules inherent in the camera's software. That statement is blatantly false. You made it false by chopping out the text around it which made clear what I was talking about. For the benefit of others I had written: "The sensor locations are irrelevant. The RAW data is derived from the sensors by rules which are determined by the manufacturer of the camera. The signals generated by the sensors are determined by the rules inherent in the camera's software. As I have already said, there is a one to one correspondence between the source image and the RAW file. You don't have a choice of RAW files for a given image. Nor do you have a choice of images for a given RAW file". They are determined by rules inherent in the camera's hardware. The sensor is not manipulated by software other than clearing it and reading it. A given amount of light on one sensor location produces *exactly* the same output from the sensor regardless of the camera's software. I should have said "The signals generated by the sensors are -interpreted- by the rules inherent in the camera's software". To that extent they are 'determined'. This statement is blatantly *different*. I quoted you exactly above. Now you want to change what you said. It's called clarification. I haven't changed the meaning. Then you don't understand what you said. I thought that was your problem. That's why I clarified it. Regardless, you are still wrong. The signals from the sensor are interpreted according to *hardware* and the resulting data set is written to a RAW file format. That is what is "interpreted" by software. Have I ever said otherwise? But so what? You did say otherwise. Quoted above. I can't see where. Someone must have accidentally deleted it. 
So you now admit that it is not software at all, but a hard wired hardware transform. I don't see why you should suddenly try to make that point. In any case, while I don't know about your camera, I can download an update for mine. That doesn't sound as though it is all hardware to me. You aren't going to download an update that changes how the hardware processes sensor data to generate the "raw data". It's hard wired. The sensor output is *analog*, and the digital data is generated by a series of *hardware* devices. About all the software does is switch the hardware on and off! First we are not talking about the RAW data. We (should) always have been talking about the RAW data file. Second, in the case of the Nikon D300 the update has changed the way in which the raw data from the sensors has been interpreted and saved to the RAW file. I assumed a certain level of technical competence from your background. That is correct. A few decades working with digital data, transmission systems (which is what the hardware between the sensor and the CF card is) and little things like that... :-) I thought you were a psychologist. I wouldn't argue with you over what you have just said but I would like to point out that this discussion has been about the interpolated data saved in the RAW file. The raw data saved in the RAW file is not interpolated. See the last line of ... http://en.wikipedia.org/wiki/Bayer_filter "Bryce Bayer's patent called the green photosensors luminance-sensitive elements and the red and blue ones chrominance-sensitive elements. He used twice as many green elements as red or blue to mimic the human eye's greater resolving power with green light. These elements are referred to as sensor elements, sensels, pixel sensors, or simply pixels; sample values sensed by them, after interpolation, become image pixels." The raw data does not define one specific image. When the data is interpolated there is then an image! 
And when that data is interpolated and saved in a RAW file then there That does not happen. (Except of course for the various thumbnail JPEG images that are embedded in most RAW files.) Umm... is only the one image. Run that RAW file backwards through the same transformation and you end back up with the original image of which there can only be the one. It is a one way process and it cannot be precisely reversed. If for no other reason than what is called quantization distortion... Don't come the technical heavy with me! After all, you are the one who claimed to not understand what I meant by "statistical error limitations". However, in addition to that the JPEG image which results from interpolation simply does not contain anything like the full amount of information that was in the RAW file's data. You cannot reverse the process. HOW DO I MAKE IT CLEAR THAT FROM THE BEGINNING WE HAVE BEEN TALKING ABOUT THE RELATIONSHIP BETWEEN THE ORIGINAL IMAGE ON THE SENSOR AND THE 'RAW' FILE. Sorry for shouting but I've said the above several times, I've quoted from the original articles, and you still keep trying to switch to conversion from RAW to JPG. That's an entirely different question. Eric Stevens |
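For readers trying to follow the Bayer dispute in this exchange: demosaicing can be sketched as below. This is a deliberately naive bilinear interpolation over a hypothetical 4x4 RGGB mosaic (the pattern, the stand-in sample values, and the wrap-around edge handling are all illustrative assumptions, not any camera's real algorithm); it shows how each output pixel is estimated from a neighbourhood of single-colour samples.

```python
import numpy as np

# Hypothetical 4x4 RGGB mosaic: one sample per photosite (values made up).
mosaic = np.arange(16, dtype=float).reshape(4, 4)

def bilinear_demosaic(raw):
    """Very naive bilinear demosaic of an RGGB mosaic (wrap-around edges)."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    # masks marking where each colour was actually sampled
    r = np.zeros_like(raw)
    r[0::2, 0::2] = 1
    b = np.zeros_like(raw)
    b[1::2, 1::2] = 1
    g = 1 - r - b
    for ch, mask in enumerate((r, g, b)):
        known = raw * mask
        # average the known samples in each site's 3x3 neighbourhood
        num = np.zeros_like(raw)
        den = np.zeros_like(raw)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                num += np.roll(np.roll(known, dy, 0), dx, 1)
                den += np.roll(np.roll(mask, dy, 0), dx, 1)
        rgb[..., ch] = num / np.maximum(den, 1)
    return rgb

image = bilinear_demosaic(mosaic)   # full-colour estimate, not raw data
```

Each output pixel draws on a 3x3 group of neighbouring single-colour samples, which is the "group of 9 (or more)" point made earlier in the thread.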
#14
Could you actually see photos made from RAW files?
Eric Stevens wrote:
You actually are that dense! I read the words. I even quoted them down below. In fact, because you have (surreptitiously) snipped an awful lot of what follows, it's only two paragraphs down. You still think that I should quote your entire silly article. That is *dense*. So what are you trying to say? The sensor locations *are* relevant! Only as part of the transformation algorithm. No **** Sherlock. Which is to say, yes they are and your statements otherwise were mistaken from the start. You do understand that "firmware" is where the "software" is, right? I'm getting the idea that you have a list of buzz words; but no idea what any of it means. One of us doesn't seem to. And it isn't at all difficult to determine which that would be, Eric. Try, for example, to get a grip on "interpolation" before you continue on with this discussion. Try learning where software is used in the data flow, and where it is a purely hardware process. You made it false by chopping out the text around it which made clear what I was talking about. For the benefit of others I had written: "The sensor locations are irrelevant. The RAW data is derived from the sensors by rules which are determined by the manufacturer of the camera. The signals generated by the sensors are determined by the rules inherent in the camera's software. As I have already said, there is a one to one correspondence between the source image and the RAW file. You don't have a choice of RAW files for a given image. Nor do you have a choice of images for a given RAW file". Yes, and your statement is still false. Explaining a false statement doesn't change the fact that it is false. False: "sensor locations are irrelevant" False: "signals generated by the sensors are determined by ... software" False: "Nor do you have a choice of images for a given RAW file" Nice paragraph. Regardless, you are still wrong. 
The signals from the sensor are interpreted according to *hardware* and the resulting data set is written to a RAW file format. That is what is "interpreted" by software. Have I ever said otherwise? But so what? You did say otherwise. Quoted above. I can't see where. Someone must have accidentally deleted it. Now, if you missed where you'd said it before, take a look at what you just repeated. See it now? It's wrong... So you now admit that it is not software at all, but a hard wired hardware transform. I don't see why you should suddenly try to make that point. In any case, while I don't know about your camera, I can download an update for mine. That doesn't sound as though it is all hardware to me. You aren't going to download an update that changes how the hardware processes sensor data to generate the "raw data". It's hard wired. The sensor output is *analog*, and the digital data is generated by a series of *hardware* devices. About all the software does is switch the hardware on and off! First we are not talking about the RAW data. We (should) always have been talking about the RAW data file. The "RAW data file" is merely a file containing the camera raw data. The only part of the file that relates to the image is the data it contains. Which is to say that we *are* talking about the "RAW data", even if you want to call it a "RAW data file". It's the same data either way. Note that "sensor data", in the context of this discussion, would be the analog data directly read from the sensor (though in other contexts those words might be used to mean the digital data too). "RAW data" clearly must refer to the digital data that goes into the "RAW file". That is the only place where "RAW" is used. (And I often use "raw", simply because "RAW" is grammatically incorrect. They are the same.) Second, in the case of the Nikon D300 the update has changed the way in which the raw data from the sensors has been interpreted and saved to the RAW file. 
Nice try, but I just read the release notes for Nikon's upgraded firmware for the D300, and saw exactly *nothing* like what you are saying. Provide details, and be specific. I wouldn't argue with you over what you have just said but I would like to point out that this discussion has been about the interpolated data saved in the RAW file. The raw data saved in the RAW file is not interpolated. See the last line of ... http://en.wikipedia.org/wiki/Bayer_filter "Bryce Bayer's patent called the green photosensors luminance-sensitive elements and the red and blue ones chrominance-sensitive elements. He used twice as many green elements as red or blue to mimic the human eye's greater resolving power with green light. These elements are referred to as sensor elements, sensels, pixel sensors, or simply pixels; sample values sensed by them, after interpolation, become image pixels." Didn't you read what that paragraph says???? Are you unable to determine that the "after interpolation" is referring not to generation of data that goes into the RAW file, but rather what is done with data *from* the RAW file in order to make an image (such as TIFF or JPEG). That is what "image pixels" means. The data saved in the RAW file has not yet been interpolated, and when it is interpolated it is *not* saved in the RAW file, and is no longer considered "raw" data. The raw data does not define one specific image. When the data is interpolated there is then an image! And when that data is interpolated and saved in a RAW file then there That does not happen. (Except of course for the various thumbnail JPEG images that are embedded in most RAW files.) Umm... Ummmm..... see above. is only the one image. Run that RAW file backwards through the same transformation and you end back up with the original image of which there can only be the one. It is a one way process and it cannot be precisely reversed. If for no other reason than what is called quantization distortion... 
Don't come the technical heavy with me! After all, you are the one who claimed to not understand what I meant by "statistical error limitations". I know exactly what *you* meant by that. The point was that your statement was wrong, and you threw in a nonsensical statement to make it appear to have significance. Statistical error limitations indeed! However, in addition to that the JPEG image which results from interpolation simply does not contain anything like the full amount of information that was in the RAW file's data. You cannot reverse the process. HOW DO I MAKE IT CLEAR THAT FROM THE BEGINNING WE HAVE BEEN TALKING ABOUT THE RELATIONSHIP BETWEEN THE ORIGINAL IMAGE ON THE SENSOR AND THE 'RAW' FILE. Then don't talk about interpolation and other software processing of the raw data, all of which takes place on data *after* it is placed in the RAW file. Sorry for shouting but I've said the above several times, And then you talk about something different. You don't seem to have even a meager knowledge of the data flow. I've quoted from the original articles, and you still keep trying to switch to conversion from RAW to JPG. That's an entirely different question. Then stop talking about processing the RAW data to make an image. -- Floyd L. Davidson http://www.apaflo.com/floyd_davidson Ukpeagvik (Barrow, Alaska) |
#15
Could you actually see photos made from RAW files?
On Sun, 31 May 2009 23:44:15 -0800, (Floyd L.
Davidson) wrote: Eric Stevens wrote: You actually are that dense! I read the words. I even quoted them down below. In fact, because you have (surreptitiously) snipped an awful lot of what follows, it's only two paragraphs down. You still think that I should quote your entire silly article. That is *dense*. Hell No! You should cut and paste my article to make it mean whatever you would like it to mean. :-( So what are you trying to say? The sensor locations *are* relevant! Only as part of the transformation algorithm. No **** Sherlock. Which is to say, yes they are and your statements otherwise were mistaken from the start. You do understand that "firmware" is where the "software" is, right? I'm getting the idea that you have a list of buzz words; but no idea what any of it means. One of us doesn't seem to. And it isn't at all difficult to determine which that would be, Eric. Try, for example, to get a grip on "interpolation" before you continue on with this discussion. Try learning where software is used in the data flow, and where it is a purely hardware process. YOU try looking up interpolation in the context of the Bayer process. I've done it once already for you. It didn't seem to ring a bell, even the first time. You made it false by chopping out the text around it which made clear what I was talking about. For the benefit of others I had written: "The sensor locations are irrelevant. The RAW data is derived from the sensors by rules which are determined by the manufacturer of the camera. The signals generated by the sensors are determined by the rules inherent in the camera's software. As I have already said, there is a one to one correspondence between the source image and the RAW file. You don't have a choice of RAW files for a given image. Nor do you have a choice of images for a given RAW file". Yes, and your statement is still false. Explaining a false statement doesn't change the fact that it is false. 
False: "sensor locations are irrelevant" False: "signals generated by the sensors are determined by ... software" False: "Nor do you have a choice of images for a given RAW file" Look up Bayer interpolation. Nice paragraph. Regardless, you are still wrong. The signals from the sensor are interpreted according to *hardware* and the resulting data set is written to a RAW file format. That is what is "interpreted" by software. Have I ever said otherwise? But so what? You did say otherwise. Quoted above. I can't see where. Someone must have accidentally deleted it. Now, if you missed where you'd said it before, take a look at what you just repeated. See it now? It's wrong... So you now admit that it is not software at all, but a hard wired hardware transform. I don't see why you should suddenly try to make that point. In any case, while I don't know about your camera, I can download an update for mine. That doesn't sound as though it is all hardware to me. You aren't going to download an update that changes how the hardware processes sensor data to generate the "raw data". It's hard wired. The sensor output is *analog*, and the digital data is generated by a series of *hardware* devices. About all the software does is switch the hardware on and off! First we are not talking about the RAW data. We (should) always have been talking about the RAW data file. The "RAW data file" is merely a file containing the camera raw data. The only part of the file that relates to the image is the data it contains. Which is to say that we *are* talking about the "RAW data", even if you want to call it a "RAW data file". It's the same data either way. Nope. What comes out of the sensor is not what is saved in the RAW file. There is a transformation involved. Note that "sensor data", in the context of this discussion, would be the analog data directly read from the sensor .... .... what analog data? ...(though in other contexts those words might be used to mean the digital data too). 
I see you have had second thoughts. "RAW data" clearly must refer to the digital data that goes into the "RAW file". That is the only place where "RAW" is used. (And I often use "raw", simply because "RAW" is grammatically incorrect. They are the same.) Second, in the case of the Nikon D300 the update has changed the way in which the raw data from the sensors has been interpreted and saved to the RAW file. Nice try, but I just read the release notes for Nikon's upgraded firmware for the D300, and saw exactly *nothing* like what you are saying. Provide details, and be specific. What do you make of:
. Image quality: NEF (RAW) + JPEG
. NEF (RAW) recording: Lossless compressed or Compressed
. Image size: S or M
That sounds like a change in the way raw data from the sensors has been interpreted and saved to the RAW file. I wouldn't argue with you over what you have just said but I would like to point out that this discussion has been about the interpolated data saved in the RAW file. The raw data saved in the RAW file is not interpolated. See the last line of ... http://en.wikipedia.org/wiki/Bayer_filter "Bryce Bayer's patent called the green photosensors luminance-sensitive elements and the red and blue ones chrominance-sensitive elements. He used twice as many green elements as red or blue to mimic the human eye's greater resolving power with green light. These elements are referred to as sensor elements, sensels, pixel sensors, or simply pixels; sample values sensed by them, after interpolation, become image pixels." Didn't you read what that paragraph says???? Are you unable to determine that the "after interpolation" is referring not to generation of data that goes into the RAW file, but rather what is done with data *from* the RAW file in order to make an image (such as TIFF or JPEG). That is what "image pixels" means. Haw! You really don't understand what Bayer interpolation is all about. The data saved in the RAW file has not yet been interpolated, .... 
How else do you reckon it is derived from the Bayer mosaic? ... and when it is interpolated it is *not* saved in the RAW file, and is no longer considered "raw" data. Well, at least you understand that much. The raw data does not define one specific image. When the data is interpolated there is then an image! And when that data is interpolated and saved in a RAW file then there That does not happen. (Except of course for the various thumbnail JPEG images that are embedded in most RAW files.) Umm... Ummmm..... see above. is only the one image. Run that RAW file backwards through the same transformation and you end back up with the original image of which there can only be the one. It is a one way process and it cannot be precisely reversed. If for no other reason than what is called quantization distortion... Don't come the technical heavy with me! After all, you are the one who claimed to not understand what I meant by "statistical error limitations". I know exactly what *you* meant by that. The point was that your statement was wrong, and you threw in a nonsensical statement to make it appear to have significance. Statistical error limitations indeed! My oath there are statistical error limitations! That you call it 'quantization distortion' doesn't change the fact. However, in addition to that the JPEG image which results from interpolation simply does not contain anything like the full amount of information that was in the RAW file's data. You cannot reverse the process. HOW DO I MAKE IT CLEAR THAT FROM THE BEGINNING WE HAVE BEEN TALKING ABOUT THE RELATIONSHIP BETWEEN THE ORIGINAL IMAGE ON THE SENSOR AND THE 'RAW' FILE. Then don't talk about interpolation and other software processing of the raw data, all of which takes place on data *after* it is placed in the RAW file. Dingbat - interpolation is an essential part of going from the Bayer array to the RAW data file. Please don't continue to pretend otherwise. 
Sorry for shouting but I've said the above several times, And then you talk about something different. You don't seem to have even a meager knowledge of the data flow. I've quoted from the original articles, and you still keep trying to switch to conversion from RAW to JPG. That's an entirely different question. Then stop talking about processing the RAW data to make an image. I haven't been. If anything I've been talking about working backwards from the RAW data file to reconstruct the original image. Eric Stevens |
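On the "working backwards" point, a small sketch (all values hypothetical) of why one set of raw samples does not pin down one image: two different, equally defensible interpolation rules applied to the same raw data yield different full-colour results, so the mapping from raw file to rendered image is not one-to-one.

```python
import numpy as np

# Hypothetical raw samples along one row; zeros mark missing sites
# (as in a Bayer mosaic, where each site records only one colour).
raw = np.array([10.0, 0.0, 30.0, 0.0, 50.0])
known = np.array([1, 0, 1, 0, 1], dtype=bool)   # where samples really exist
missing = np.where(~known)[0]

# Rule A: nearest-neighbour -- copy the sample to the left.
rule_a = raw.copy()
rule_a[missing] = raw[missing - 1]

# Rule B: linear -- average the two flanking samples.
rule_b = raw.copy()
rule_b[missing] = (raw[missing - 1] + raw[missing + 1]) / 2.0

# Same raw data, two reasonable rules, two different "images".
```

Rule A fills the gaps with 10 and 30, Rule B with 20 and 40; both are legitimate interpolations of the identical raw samples.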
#16
Could you actually see photos made from RAW files?
In article , Eric Stevens
wrote: The "RAW data file" is merely a file containing the camera raw data. The only part of the file that relates to the image is the data it contains. Which is to say that we *are* talking about the "RAW data", even if you want to call it a "RAW data file". It's the same data either way. Nope. What comes out of the sensor is not what is saved in the RAW file. There is a transformation involved. what type of transformation and from what to what? depending on the camera, there may be minor changes such as analog white balance or noise reduction, but for all intents the data in the raw file *is* the data off the sensor, at least with bayer sensors. Note that "sensor data", in the context of this discussion, would be the analog data directly read from the sensor .... ... what analog data? from the sensor, before the a/d converter. Second, in the case of the Nikon D300 the update has changed the way in which the raw data from the sensors has been interpreted and saved to the RAW file. Nice try, but I just read the release notes for Nikon's upgraded firmware for the D300, and saw exactly *nothing* like what you are saying. Provide details, and be specific. What do you make of:
. Image quality: NEF (RAW) + JPEG
. NEF (RAW) recording: Lossless compressed or Compressed
. Image size: S or M
That sounds like a change in the way raw data from the sensors has been interpreted and saved to the RAW file. no it doesn't. the first is embedding the jpeg in addition to the raw data and the second is how it's compressed. the third is for the size of the jpeg file. raw files are always full size, with canon's sraw being an exception (and since this is a nikon d300, not applicable). I wouldn't argue with you over what you have just said but I would like to point out that this discussion has been about the interpolated data saved in the RAW file. The raw data saved in the RAW file is not interpolated. See the last line of ... 
http://en.wikipedia.org/wiki/Bayer_filter "Bryce Bayer's patent called the green photosensors luminance-sensitive elements and the red and blue ones chrominance-sensitive elements. He used twice as many green elements as red or blue to mimic the human eye's greater resolving power with green light. These elements are referred to as sensor elements, sensels, pixel sensors, or simply pixels; sample values sensed by them, after interpolation, become image pixels." Didn't you read what that paragraph says???? Are you unable to determine that the "after interpolation" is referring not to generation of data that goes into the RAW file, but rather what is done with data *from* the RAW file in order to make an image (such as TIFF or JPEG). That is what "image pixels" means. Haw! You really don't understand what Bayer interpolation is all about. if anyone doesn't understand it, it's you. nothing in what *you* quoted says the data in the raw *file* is interpolated. the interpolation is done in the raw converter on the computer, long after the raw file has been created. The data saved in the RAW file has not yet been interpolated, .... How else do you reckon it is derived from the Bayer mosaic? the data in the raw file is *before* the interpolation is done to demosaic the image. ... and when it is interpolated it is *not* saved in the RAW file, and is no longer considered "raw" data. Well, at least you understand that much. odd, because that contradicts what you've been saying. However, in addition to that the JPEG image which results from interpolation simply does not contain anything like the full amount of information that was in the RAW file's data. You cannot reverse the process. HOW DO I MAKE IT CLEAR THAT FROM THE BEGINNING WE HAVE BEEN TALKING ABOUT THE RELATIONSHIP BETWEEN THE ORIGINAL IMAGE ON THE SENSOR AND THE 'RAW' FILE. 
Then don't talk about interpolation and other software processing of the raw data, all of which takes place on data *after* it is placed in the RAW file.

Dingbat - interpolation is an essential part of going from the Bayer array to the RAW data file. Please don't continue to pretend otherwise.

no need to pretend otherwise, since that's totally incorrect.

I've quoted from the original articles, and you still keep trying to switch to conversion from RAW to JPG. That's an entirely different question.

Then stop talking about processing the RAW data to make an image.

I haven't been. If anything I've been talking about working backwards from the RAW data file to reconstruct the original image.

that doesn't make any sense.
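The core of the disagreement above is *where* Bayer interpolation happens. A minimal sketch (pure Python, an RGGB pattern and toy values assumed) of what a raw converter does after reading the file: the raw file stores one value per sensel, and the green/red/blue estimates at each site are computed later, from a neighbourhood of sensels.

```python
# Toy 4x4 Bayer mosaic (RGGB pattern assumed): one value per sensel,
# which is all a raw file stores -- no interpolation has happened yet.
mosaic = [
    [10, 200, 12, 210],   # R G R G
    [100, 50, 110, 55],   # G B G B
    [14, 220, 16, 230],   # R G R G
    [120, 60, 130, 65],   # G B G B
]

def green_at(y, x):
    """Estimate green at a non-green site by bilinear interpolation of
    the green neighbours -- done in the raw converter, not in the file."""
    neighbours = [mosaic[y + dy][x + dx]
                  for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                  if 0 <= y + dy < 4 and 0 <= x + dx < 4]
    return sum(neighbours) / len(neighbours)

# Green estimate at the red sensel (2, 2), averaging 110, 130, 220, 230:
print(green_at(2, 2))   # 172.5
```

This is only the simplest demosaicing scheme; real converters use more elaborate ones, which is exactly why one set of raw values can produce more than one image.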
#17
Could you actually see photos made from RAW files?
Eric Stevens wrote:
On Sun, 31 May 2009 23:44:15 -0800, (Floyd L. Davidson) wrote:

Eric Stevens wrote: You actually are that dense!

I read the words. I even quoted them down below. In fact, because you have (surreptitiously) snipped an awful lot of what follows, it's only two paragraphs down.

You still think that I should quote your entire silly article. That is *dense*.

Hell No! You should cut and paste my article to make it mean whatever you would like it to mean. :-(

I have not changed a single character of the silly things you've written, and in no way have I ever changed the meaning of a single sentence. Please do not blame me for what you write! Again though, it's time that you learned that proper Usenet netiquette is to trim the quoted text to only that required for context. I do that. You don't. And it isn't at all difficult to determine which that would be, Eric. Try, for example, to get a grip on "interpolation" before you continue on with this discussion. Try learning where software is used in the data flow, and where it is a purely hardware process.

YOU try looking up interpolation in the context of the Bayer process. I've done it once already for you. It didn't seem to ring a bell, even the first time.

Such as where you quoted Wikipedia, and didn't understand what it said! As noted just 5 or so lines above, learn something about the data flow.

You made it false by chopping out the text around it which made clear what I was talking about. For the benefit of others I had written: "The sensor locations are irrelevant. The RAW data is derived from the sensors by rules which are determined by the manufacturer of the camera. The signals generated by the sensors are determined by the rules inherent in the camera's software. As I have already said, there is a one to one correspondence between the source image and the RAW file. You don't have a choice of RAW files for a given image. Nor do you have a choice of images for a given RAW file".

Yes, and your statement is still false.
Explaining a false statement doesn't change the fact that it is false.

False: "sensor locations are irrelevant"
False: "signals generated by the sensors are determined by ... software"
False: "Nor do you have a choice of images for a given RAW file"

Look up Bayer interpolation. And you'll find: 1) that sensor locations are relevant; 2) that the signals generated by the sensors are not determined by software, as that is a purely hardware domain; 3) that the RAW file is data which has not been interpolated, and when it is interpolated there are many choices for a given set of raw data.

Note that interpolation is not what generates the data in a "RAW file"; it is what generates a TIFF or JPEG formatted image file. Remember that you wanted to only talk about RAW data, not the JPEG... but here you are once again discussing the processing of data to produce an image file... Hmmm... right here it is:

First we are not talking about the RAW data. We (should) always have been talking about the RAW data file.

The "RAW data file" is merely a file containing the camera raw data. The only part of the file that relates to the image is the data it contains. Which is to say that we *are* talking about the "RAW data", even if you want to call it a "RAW data file". It's the same data either way.

Nope. What comes out of the sensor is not what is saved in the RAW file. There is a transformation involved.

That is a hardware transformation, not one that is called interpolation and not one that is done with software.

Note that "sensor data", in the context of this discussion, would be the analog data directly read from the sensor ....

... what analog data?

The sensor is an analog device.

...(though in other contexts those words might be used to mean the digital data too).

I see you have had second thoughts.
Neither of us has used it in that context, and since you have repeatedly confused various parts of the data flow it is absolutely essential to differentiate the analog sensor data from the digital data output of the ADC. In many contexts the sensor, the ISO amplifiers, and the ADC are all considered "the sensor" in order to simplify a discussion that really does not involve them other than as a unit. This is clearly not one of those discussions.

"RAW data" clearly must refer to the digital data that goes into the "RAW file". That is the only place where "RAW" is used. (And I often use "raw", simply because "RAW" is grammatically incorrect. They are the same.)

Second, in the case of the Nikon D300 the update has changed the way in which the raw data from the sensors have been interpreted and saved to the RAW file.

Nice try, but I just read the release notes for Nikon's upgraded firmware for the D300, and saw exactly *nothing* like what you are saying. Provide details, and be specific.

What do you make of:
. Image quality: NEF (RAW) + JPEG
. NEF (RAW) recording: Lossless compressed or Compressed
. Image size: S or M
That sounds like a change in the way raw data from the sensors have been interpreted and saved to the RAW file.

It looks like the specifications from the User Manual to me. Where's the change? Granted though that they *could* upgrade the firmware with a different RAW file data format. I can see how *you* would call that a change in the data, but in fact it isn't. It is a change in the way the data is represented and stored in the file, but the *value* of the data remains the same. Which is to say that the data values which a raw converter will use for interpolation will not be different. That processing is merely encoding of values, not a form of processing that affects the data or the images that are eventually produced.

The raw data saved in the RAW file is not interpolated. See the last line of ...
http://en.wikipedia.org/wiki/Bayer_filter

"Bryce Bayer's patent called the green photosensors luminance-sensitive elements and the red and blue ones chrominance-sensitive elements. He used twice as many green elements as red or blue to mimic the human eye's greater resolving power with green light. These elements are referred to as sensor elements, sensels, pixel sensors, or simply pixels; sample values sensed by them, after interpolation, become image pixels."

Didn't you read what that paragraph says? Are you unable to determine that the "after interpolation" is referring not to generation of data that goes into the RAW file, but rather to what is done with data *from* the RAW file in order to make an image (such as TIFF or JPEG)? That is what "image pixels" means.

Haw! You really don't understand what Bayer interpolation is all about.

Haw! You have clearly misread the text you quoted! Hilarious again! The RAW data file does not contain data that has been interpolated (de-mosaiced).

The data saved in the RAW file has not yet been interpolated, ....

How else do you reckon it is derived from the Bayer mosaic?

The raw data in the RAW file is derived using a rather standard Analog-to-Digital Converter. Between the sensor and the ADC there are analog amplifiers with programmable gain, which are used to set the ISO sensitivity. The RAW file contains data that has not yet been de-mosaiced. That is a software process that is used to generate an image format, such as JPEG. (Remember that? The process you didn't want to be discussed?)

... and when it is interpolated it is *not* saved in the RAW file, and is no longer considered "raw" data.

Well, at least you understand that much.

But you don't, and go on to claim that it is:

Dingbat - interpolation is an essential part of going from the Bayer array to the RAW data file. Please don't continue to pretend otherwise.

You don't have a clue. Interpolation has *nothing* to do with going from the Bayer array to the RAW data file.
It has to do with going from the raw data to an image format, which as noted above is *no* *longer* *called* *raw* *data*. It's usually either a TIFF or a JPEG data set. That is *not* the data in a RAW data file.

Then stop talking about processing the RAW data to make an image.

I haven't been. If anything I've been talking about working backwards from the RAW data file to reconstruct the original image.

Every time you mention "interpolation" or hint at the de-mosaic process you are talking about processing the RAW data to make an image. It is *never* a reconstruction of the original image, because that simply does not happen.

--
Floyd L. Davidson http://www.apaflo.com/floyd_davidson
Ukpeagvik (Barrow, Alaska)
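Floyd's claim that a given set of raw data admits many images (and never a reconstruction of the original scene) can be illustrated with one converter choice, white balance. A sketch (per-channel gains are purely illustrative, not any camera's actual values): identical raw values, different renderings depending on the multipliers the converter applies.

```python
raw_rgb = (400, 600, 500)   # hypothetical demosaiced sensel values for one pixel

def render(rgb, wb_gains):
    """Apply per-channel white-balance gains -- one of many processing
    choices a raw converter makes on identical raw data."""
    return tuple(round(c * g) for c, g in zip(rgb, wb_gains))

daylight = render(raw_rgb, (2.0, 1.0, 1.5))   # one plausible rendering
tungsten = render(raw_rgb, (1.2, 1.0, 2.4))   # another, from the same raw data

# Same raw input, two different output images:
assert daylight != tungsten
```

Demosaicing algorithm, tone curve, and colour profile are further such choices, which is why "one RAW file, one image" does not hold in the direction Floyd disputes.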
#18
Could you actually see photos made from RAW files?
In article , Bob Larter
wrote:

He's talking about the *process* of converting from the RAW image to the RGB image that you see on your screen, which includes Bayer deconvolution. As he says, there is no 1:1 relationship between a pixel ("sensel") on the image sensor & a pixel on the RGB image that you see on your screen.

there is a 1:1 mapping of sensels to pixels, although multiple sensels are used to calculate one pixel. in other words, 10 million sensels on the sensor -> 10 million pixels in the image.
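nospam's point above is that the mapping is 1:1 in *count* but many-to-one in *computation*. A sketch (toy sensel values, a 3x3 neighbourhood assumed): every output pixel is computed from several sensels, yet the output has exactly as many pixels as the sensor has sensels.

```python
# Toy 6x6 array of sensel values.
mosaic = [[(x + y) % 256 for x in range(6)] for y in range(6)]
h, w = len(mosaic), len(mosaic[0])

def pixel(y, x):
    """One output pixel from a 3x3 neighbourhood of sensels
    (edge-clamped), i.e. a many-to-one computation."""
    window = [mosaic[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
              for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    return sum(window) // len(window)

image = [[pixel(y, x) for x in range(w)] for y in range(h)]

# 36 sensels in, 36 pixels out: 1:1 in count, 9:1 in computation.
assert sum(len(row) for row in image) == h * w
```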
#19
Could you actually see photos made from RAW files?
In article , Bob Larter
wrote:

TTBOMK, the only transformation is the A2D conversion. And that lack of transformations is, after all, the whole point of the RAW file format in the first place.

basically that's true, however, nikon did apply white balance to the raw data in some cameras before writing it to the file (d1 series, if i recall). i doubt that's what he meant, and as far as i know, it's no longer done.

not to sidetrack, but the only cameras that actually do a transform are sigma/foveon. the raw data in a sigma raw file is *not* what came off the sensor and has gone through quite a bit of processing before being written to the file (which is kinda funny, given the crazy claims about it being 'true colour').
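The D1-era behaviour nospam recalls, white balance applied before the raw data is written, means the file no longer holds the sensor's own readings. A sketch (values and gain entirely hypothetical) of why that matters for the "raw = sensor data" claim:

```python
sensor_values = [500, 510, 490, 505]   # hypothetical sensel readings
wb_gain = 1.25                         # hypothetical in-camera gain

# What a camera that bakes white balance into the raw data would write:
written = [round(v * wb_gain) for v in sensor_values]

# The file's contents no longer match the sensor readings,
# so the transformation cannot be undone exactly from the file alone.
assert written != sensor_values
```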
#20
Could you actually see photos made from RAW files?
In article , Bob Larter
wrote:

He's talking about the *process* of converting from the RAW image to the RGB image that you see on your screen, which includes Bayer deconvolution. As he says, there is no 1:1 relationship between a pixel ("sensel") on the image sensor & a pixel on the RGB image that you see on your screen.

there is a 1:1 mapping of sensels to pixels, although multiple sensels are used to calculate one pixel. in other words, 10 million sensels on the sensor -> 10 million pixels in the image.

That's the default, but there's no mathematical necessity for the number of output pixels to equal the number of output pixels.

i would hope the number of output pixels equals the number of output pixels

unless you upscale or downscale, the number of input sensels will be the same as the number of output pixels.