#41
Could you actually see photos made from RAW files?
#43
Could you actually see photos made from RAW files?
On Mon, 01 Jun 2009 17:48:17 -0800, (Floyd L.
Davidson) wrote: Eric Stevens wrote: On Tue, 02 Jun 2009 00:19:18 +1000, Bob Larter wrote: Eric Stevens wrote: On Sat, 30 May 2009 21:32:27 -0800, (Floyd L. Davidson) wrote: Floyd has failed to point out that at this point he has omitted a great deal of previous text. I know he has been around more than long enough to know that this is not the right thing to do. Eric Stevens wrote: Floyd, I suspect you have been smoking something which is not good for you. Subject to statistical error limitations, there is a one to one correspondence between the source image and the RAW file. One can be converted to the other using the rules inherent in the camera's software. What do you mean by "the source image"? That which is projected onto the sensor by the lens. That's what's encoded in the RAW file. It's a losslessly compressed dump of the contents of the image sensor, without any interpretation. Not so. The sensor stores volts. The RAW file maps this in 12 or 14 bit patterns. Some transformation has clearly taken place. Nor is it necessarily compressed and, if compression has taken place it need not be lossless. It would perhaps have been preferable if Bob had used "interpolation" rather than "interpretation", but in fact he is technically correct anyway. What you describe is encoding of the data, not interpretation or interpolation. He's talking about the *process* of converting from the RAW image to the RGB image that you see on your screen, which includes Bayer deconvolution. As he says, there is no 1:1 relationship between a pixel ("sensel") on the image sensor & a pixel on the RGB image that you see on your screen. My original proposition was that there is a one to one correspondence between the source image (that which is projected onto the sensor by the lens) and data which is stored in the RAW file. It's not possible that any single RAW file could be produced by more than one sensor image. 
What isn't possible is to use the recorded data in the RAW file to precisely reproduce an image which is exactly the same as the one which was projected onto the sensor. That is indeed simply because there were multiple possible images which could produce the same raw data. So you keep saying. Please explain. This is at the heart of our argument. Eric Stevens |
#44
Could you actually see photos made from RAW files?
nospam wrote:
In article , Floyd L. Davidson wrote: unless you upscale or downscale, the number of input sensels will be the same as the number of output pixels. That is not true. The Nikon D3, as one example, is specified as having a 12.87 Mega pixel sensor. The images it generates are specified to be 4256x2832, which works out to 12.05 Mega pixels. Dave Coffin's dcraw program generates a 4284x2844 (12.18Mp) image. so he keeps the border pixels. that doesn't negate what i said. It certainly does. "The number of input sensels will be the same as the number of output pixels" is not true. And the difference is not fixed either. It depends on how the interpolation is done. -- Floyd L. Davidson http://www.apaflo.com/floyd_davidson Ukpeagvik (Barrow, Alaska) |
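The arithmetic in Floyd's example is easy to verify. A quick sketch (Python), using only the figures quoted in the post above:

```python
# Nikon D3 figures as quoted in the post above.
sensor_mp = 12.87e6          # total sensels per Nikon's sensor spec
nikon_out = 4256 * 2832      # pixels in the camera's own output image
dcraw_out = 4284 * 2844      # pixels in dcraw's output image

print(f"Nikon output: {nikon_out / 1e6:.2f} MP")   # 12.05 MP
print(f"dcraw output: {dcraw_out / 1e6:.2f} MP")   # 12.18 MP

# Neither figure matches the sensel count, and the two converters
# disagree with each other -- supporting the point that "number of
# input sensels == number of output pixels" does not hold in general.
print(f"Sensels unaccounted for by Nikon: {sensor_mp - nikon_out:.0f}")
```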
#45
Could you actually see photos made from RAW files?
In rec.photo.digital Bob Larter wrote:
nospam wrote: In article , Bob Larter wrote: TTBOMK, the only transformation is the A2D conversion. And that lack of transformations is, after all, the whole point of the RAW file format in the first place. basically that's true, however, nikon did apply white balance to the raw data in some cameras before writing it to the file (d1 series, if i recall). i doubt that's what he meant, and as far as i know, it's no longer done. God, I'd hope not! There are two reasons why I shoot RAW: (1) to get the most dynamic range from my shots, ie; to push them a stop or two, & (2) To fix the white balance for shots that were taken under mixed lighting. not to sidetrack, but the only cameras that actually do a transform are sigma/foveon. the raw data in a sigma raw file is *not* what came off the sensor and has gone through quite a bit of processing before being written to the file (which is kinda funny, given the crazy claims about it being 'true colour'). Yuck. IMHO, the whole point of RAW images is that they haven't been screwed around with. It seems that some camera makers don't consider doing some "improvements" to the RAW data from the sensor as "screwing around with" :-) -- Chris Malcolm |
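Bob's dynamic-range point is worth a concrete illustration: baking a white-balance multiplier into the raw values before writing the file can clip highlights irrecoverably, which is exactly why deferring it to post-processing is preferable. A minimal sketch (Python; the 12-bit range is standard, but the 1.8x red multiplier and the sample values are illustrative assumptions, not any particular camera's):

```python
MAX12 = 4095  # full scale for a 12-bit raw value

def apply_wb(raw_value, gain):
    """Scale a raw channel value by a white-balance gain, clipping to 12 bits."""
    return min(int(raw_value * gain), MAX12)

bright_red = 3000                   # a bright but unclipped red sensel
baked = apply_wb(bright_red, 1.8)   # hypothetical tungsten red multiplier
print(baked)                        # 4095 -- clipped to full scale

# Once clipped and written to the file, the original 3000 (and any raw
# value above ~2275 at this gain) is indistinguishable from saturation.
# Applying white balance later, in software, keeps those values recoverable.
```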
#46
Could you actually see photos made from RAW files?
Eric Stevens wrote:
On Mon, 01 Jun 2009 18:53:15 -0800, (Floyd L. Davidson) wrote: Eric Stevens wrote: On Mon, 01 Jun 2009 02:45:35 -0800, (Floyd L. Davidson) wrote: I have changed the silly things you've written, and I changed the meaning of a single sentence. Please blame me for what you write! Do you see what happens when you delete text without indicating the fact? I see that you are dishonest. And disgusting. I'm trying to get through to others what is wrong with what you have been doing to me. The only difference is that I confined myself to deleting individual words from the one paragraph and then I pointed out what I had done. You - you have been deleting whole paragraphs so as to completely change the sense of what we have been arguing about. You then compound matters by not even indicating where or that you have snipped. To finish it off, you start preaching about netiquette! You are basically dishonest Eric. First you equate what you did above to what I do. Then you claim that I changed the meaning of what you wrote by deleting unnecessary and unrelated context. Both are dishonest. I have *never* made any attempt at changing the meaning of what you wrote, and your statements saying I have are patently false. And lets be clear that I have *never* edited any quotes of your articles in a way that would change the meaning of what you said. Bull****. And, as you have already said, you have been round Usenet more than long enough to know better. I do know better. You are dishonest, and the above claim is unjustified in the extreme. That has never been a "norm" on Usenet. Bull****. It was the norm when you hung around sci.archaeology. I You are unaware of what is the accepted norm for Usenet, and dishonest to boot. agree it is not common here but people don't normally snip out numerous segments of the article to which they have been responding in the way that you have been doing it. Anyone who understands proper formatting of Usenet articles does! 
Note that "sensor data", in the context of this discussion, would be the analog data directly read from the sensor .... ... what analog data? The sensor is an analog device. Not really. It's counting photons. The sensor is an analog device by definition. Look up what an analog device is and stop making up your own definitions. I know what an analog device is. The wells in the sensor count electrons as a proxy for photons. You can't have fractional electrons. You can't have fractional photons. You must have integer values. It cannot be an analog device. Okay, you don't know what an analog device is, so let's go over that too. The definition of "analog" is that the signal (in this case it would be the output signal from the sensor) is continuously variable (it has an infinite number of possible values). If it is "digital" then it must have a finite set of discrete values. (Note that the two definitions are together all inclusive and are mutually exclusive. Everything is either one or the other and nothing can be both.) Now, you may notice that the output of the sensor is a voltage which is continuously variable over an infinite number of values between 0 and 1 volts. That makes it an *analog* device by definition. That analog data is fed into a device that converts it to digital data. The output has a finite set of discrete values. If, for example, it is a 12-bit digital signal then there are as many as 4096 values. These values are not continuous (i.e., 3 and 4 are discrete and there are no values between them as there would be for an analog signal), hence they are discrete, and the signal is by definition *digital*. The reason I've been stating that there are many possible images which can produce the same digital data set is because the analog signal to produce any single one of those 4096 values has an infinite number of possible values. 
The reason I've said you cannot precisely restore the original image on the sensor is because you do not know exactly which of the infinite number of possible values for each digital value should actually be used for the analog image. Basic information theory. (Ever heard of Claude E. Shannon??) That's all part of the firmware which transforms the sensor data into the data of the RAW file. That is *not* part of the firmware. Firmware, other than setting the ISO gain, has no part in any of that other than turning it on and off. You are referring to 'firmware' as though it was 'hardware'. Yet Nikon can program the camera to behave differently so some software/firmware must be involved. I am referring to that as hardware because in fact it is hardware. It is not done with software/firmware. This is much like the way you insisted for article after article that the RAW data had been interpolated... The original argument was over whether or not it was possible to work back from the RAW data and arrive at more than one sensor image. I A nonsense idea, but... ... but that's where we came in. You wrote: "That's what a camera raw file (the so called RAW format) is... a pile of parts that you can build an image from, and while the photographer may have had one specific image in mind when that pile of data was saved, it can be restructured to make a lot of different images too." And I was precisely correct. ... and I responded: "Floyd, I suspect you have been smoking something which is not good for you. Your need to post insults makes you significantly less than credible. Subject to statistical error limitations, there is a one to one correspondence between the source image and the RAW file. One can be converted to the other using the rules inherent in the camera's software. The data in the RAW file can't be restructured to make a different image without changing the data." Basically what I said was that what I just quoted you as saying is a 'nonsense idea'. 
Basically what you said in the first 5 words is that it is true that you cannot say there is a one to one relationship as you had previously stated. Then you go on to again say something that you've just pointed out isn't true. "Subject to statistical error limitations" is not escapable by just denying that it exists. It does (though that is not really a valid term for what happens), and therefore, because indeed the results *are* subject to variations, you cannot get a one to one correspondence. Indeed, there are an infinite number of different images that could produce exactly the same digital data. That is called "distortion", and is not really a "statistical error limitation", simply because it is precisely known what the digital results will be for each and every known possible analog signal input. Statistical error limitations would actually, however, be a correct term to apply to the noise errors caused by such things as clock jitter, thermal noise, etc. etc. Unless you can explain how a particular electronic charge from the sensor can be saved _in_its_original form_ on a Compact Flash card then you have to accept that the data from the sensor is transformed before it is saved. It is encoded. Not transformed. Encoding doesn't change the results. A transform is where a math formula is applied to change the resulting image (color, exposure, rotation, etc.). (A caterpillar is transformed into a butterfly... Data is encoded in binary or perhaps some m-ary higher level code such as PCM.) In no case is it reversible. You can track back through the logic and determine the original state of the sensor which gave rise to the particular Raw data file. No you cannot. Mind you, you would have to know Nikon's original transformation algorithm before you could work it backwards. I wouldn't like to have to do it. It's common knowledge. Indeed, the whole idea of interpolation is to allow someone to do as close to that as is possible. 
The difference though is that if you don't know what the end result is supposed to look like (which will allow adjusting parameters until you get an image that looks "right"), it is not possible. That is because there is not enough data retained to determine precisely which of the infinite number of possible original images actually did produce the image. Hence, through interpolation you can get any one of an infinite number of images, but you cannot determine which of them is indeed that original "source image" you've talked about. that only the one sensor image corresponds to any one RAW file data It is easy to show that there are multiple images on the sensor that could produce exactly the same data set. I invite you to do so. There are, for output values at any given sensor location, an infinite number of possible light intensities that could cause each actual possible digital value. That is, if the digital value is 2012, there are an infinite number of light values which would result in a value between 2011 and 2013, and all of them will be assigned to a value of 2012. You cannot determine which is which. Now, lest you claim it is insignificant, keep in mind that between values 2 and 4 the light intensity doubles. That means there are, for value 3, an infinite number of values which result from about a 1.4x increase in light on the sensor. Not exactly an insignificant change in light! But you cannot determine, for that range, which is correct over a 1.4x range of light intensity. Well that's a good wriggle but if you read back above you will find there are many points you have to answer. But the fact is that yes I *can* answer all of your questions. The problem is that you stubbornly refuse to learn anything. I would particularly like a clear explanation of how there can be multiple images on the sensor that could produce exactly the same data set. After all, this is where we came in. Then why haven't you made an effort to learn something? 
I've explained that several times in several different ways, and have done so again in another way in this article. It isn't rocket science. But it *is* something I've been dealing with for 40 years now... and hence it is not at all surprising that I might understand it rather well and can explain it in 15 different ways. But it does make all of your petty insults a bit of a joke, Eric. -- Floyd L. Davidson http://www.apaflo.com/floyd_davidson Ukpeagvik (Barrow, Alaska) |
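Floyd's quantisation argument can be demonstrated numerically. A minimal sketch (Python) of an idealised 12-bit converter of the kind he describes; the simple truncating transfer function over a 0-1 V input is an assumption for illustration, not a model of any specific camera's ADC:

```python
def adc_12bit(volts):
    """Idealised 12-bit A/D conversion of a 0-1 V analog signal."""
    code = int(volts * 4096)   # truncate to one of 4096 discrete levels
    return min(code, 4095)     # clamp a full-scale input to the top code

# Three different analog inputs -- i.e. three different light levels on
# the sensel -- all land on the same digital code. The raw value alone
# cannot tell us which of them was actually present: the mapping from
# analog signal to digital code is many-to-one.
for v in (0.49130, 0.49140, 0.49145):
    print(v, "->", adc_12bit(v))   # all three print code 2012
```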
#47
Could you actually see photos made from RAW files?
In message , Floyd L. Davidson
writes Everything is either one or the other and nothing can be both.) This is not correct. There are plenty of mixed devices about. Analog Devices make a few of them. That is *not* part of the firmware. Firmware, other than setting the ISO gain, has no part in any of that other than turning it on and off. You are referring to 'firmware' as though it was 'hardware'. Yet Nikon can program the camera to behave differently so some software/firmware must be involved. Perfectly correct. I am referring to that as hardware because in fact it is hardware. It is not done with software/firmware. Yes it is. What is more I can supply the tools to write the firmware. (I can't tell you which, but we have supplied software/firmware tools to more than one OEM digital camera company (P&S variety).) -- \/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\ \/\/\/\/\ Chris Hills Staffs England /\/\/\/\/ \/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
#48
Could you actually see photos made from RAW files?
On Tue, 02 Jun 2009 02:09:19 -0800, (Floyd L.
Davidson) wrote: Eric Stevens wrote: On Mon, 01 Jun 2009 18:53:15 -0800, (Floyd L. Davidson) wrote: Eric Stevens wrote: On Mon, 01 Jun 2009 02:45:35 -0800, (Floyd L. Davidson) wrote: I have changed the silly things you've written, and I changed the meaning of a single sentence. Please blame me for what you write! Do you see what happens when you delete text without indicating the fact? I see that you are dishonest. And disgusting. I'm trying to get through to others what is wrong with what you have been doing to me. The only difference is that I confined myself to deleting individual words from the one paragraph and then I pointed out what I had done. You - you have been deleting whole paragraphs so as to completely change the sense of what we have been arguing about. You then compound matters by not even indicating where or that you have snipped. To finish it off, you start preaching about netiquette! You are basically dishonest Eric. First you equate what you did above to what I do. Then you claim that I changed the meaning of what you wrote by deleting unnecessary and unrelated context. Both are dishonest. I have *never* made any attempt at changing the meaning of what you wrote, and your statements saying I have are patently false. When you go to court you swear to tell the truth, the whole truth and nothing but the truth. Even if what you quote of mine is quoted accurately, leaving out large chunks of it has me saying something other than what I actually said. It can be OK for you to do that if you indicate that that is what you have done. But you don't do that. You just delete blocks of text. This can result in me appearing to say something other than what I actually said. It can also result in the deletion of the explanation and justification of what I did say. 
It took me a while to wake up to what you were doing and then I found that when I responded to you I had to go back to my previous article to discover what you had actually done to it in your response to which I was responding. You can't claim that you don't know any better than this if only for the twenty years of experience you have had with Usenet. What you have been doing would not have been tolerated for one second twenty years ago (yeah - I was there too). And let's be clear that I have *never* edited any quotes of your articles in a way that would change the meaning of what you said. Bull****. And, as you have already said, you have been round Usenet more than long enough to know better. I do know better. If you do know better, why do you do it? You are dishonest, and the above claim is unjustified in the extreme. What? That "you have been round Usenet more than long enough to know better". That has never been a "norm" on Usenet. Bull****. It was the norm when you hung around sci.archaeology. I You are unaware of what is the accepted norm for Usenet, and dishonest to boot. See http://www.softdevlabs.com/personal/Usenet101.html "Please remember that when you do snip portions of a person's post, it's extremely important that you not only indicate that you've done so, but also where you've done so. [5] The commonly accepted way to indicate snips is to simply insert the string "snip" (or similar notation) on a line by itself at the spot where the deleted text used to be. " I expect that it will be possible to show you are a liar if you claim to have never been exposed to this practice in your 20 years of Usenet. agree it is not common here but people don't normally snip out numerous segments of the article to which they have been responding in the way that you have been doing it. Anyone who understands proper formatting of Usenet articles does! And they mark where they have snipped! 
Note that "sensor data", in the context of this discussion, would be the analog data directly read from the sensor .... ... what analog data? The sensor is an analog device. Not really. It's counting photons. The sensor is an analog device by definition. Look up what an analog device is and stop making up your own definitions. I know what an analog device is. The wells in the sensor count electrons as a proxy for photons. You can't have fractional electrons. You can't have fractional photons. You must have integer values. It cannot be an analog device. Okay, you don't know what an analog device is, so let's go over that too. The definition of "analog" is that the signal (in this case it would be the output signal from the sensor) is continuously variable (it has an infinite number of possible values). If it is "digital" then it must have a finite set of discrete values. (Note that the two definitions are together all inclusive and are mutually exclusive. Everything is either one or the other and nothing can be both.) Haw. You are now claiming that it is possible to have a continuously variable number of electrons. I maintain the number of electrons can only be represented by integers. Now, you may notice that the output of the sensor is a voltage which is continuously variable over an infinite number of values between 0 and 1 volts. That makes it an *analog* device by definition. The output of the sensor is an electrical charge. The electrical charge is dumped into a charge amplifier and it is this which outputs the voltage. This is the first step in transforming sensor image into the RAW data file. That analog data is fed into a device that converts it to digital data. The output of the charge amplifier is then digitised. This is the second step in transforming sensor image into the RAW data file. The output has a finite set of discrete values. If, for example, it is a 12-bit digital signal then there are as many as 4096 values. 
In the case of the Nikon D300 the RAW file can be output as either 12 bit or 14 bit. It is likely that before it can be transformed into either of those formats it is processed in the camera in some other format. These values are not continuous (i.e., 3 and 4 are discrete and there are no values between them as there would be for an analog signal), hence they are discrete, and the signal is by definition *digital*. Before it hits the RAW file, the data is further massaged. The reason I've been stating that there are many possible images which can produce the same digital data set is because the analog signal to produce any single one of those 4096 values has an infinite number of possible values. Not so. The digital value of say 1612 corresponds with only one state of the particular sensor element. If you know the chain of transformations between the sensels and the corresponding data of the RAW file you can work it backwards to derive from the RAW file the state of the individual sensels which gave rise to the data in the first place. The reason I've said you cannot precisely restore the original image on the sensor is because you do not know exactly which of the infinite number of possible values for each digital value should actually be used for the analog image. But you do if you understand the nature of the transformation. Basic information theory. (Ever heard of Claude E. Shannon??) Of course I have. What exactly does he have to do with it? That's all part of the firmware which transforms the sensor data into the data of the RAW file. That is *not* part of the firmware. Firmware, other than setting the ISO gain, has no part in any of that other than turning it on and off. You are referring to 'firmware' as though it was 'hardware'. Yet Nikon can program the camera to behave differently so some software/firmware must be involved. I am referring to that as hardware because in fact it is hardware. It is not done with software/firmware. 
This is much like the way you insisted for article after article that the RAW data had been interpolated... You had led me onto the garden path for that one. Nevertheless there is software between the formation of the image on the sensor and the writing of the data to the RAW file. That is why/how various problems (e.g. vertical stripes) can be cured by a firmware upgrade. The original argument was over whether or not it was possible to work back from the RAW data and arrive at more than one sensor image. I A nonsense idea, but... Its not at all nonsense, even if you think it is so. ... but that's where we came in. You wrote: "That's what a camera raw file (the so called RAW format) is... a pile of parts that you can build an image from, and while the photographer may have had one specific image in mind when that pile of data was saved, it can be restructured to make a lot of different images too." And I was precisely correct. ... and I responded: "Floyd, I suspect you have been smoking something which is not good for you. Your need to post insults makes you significantly less than credible. It was meant to be good-natured banter. Subject to statistical error limitations, there is a one to one correspondence between the source image and the RAW file. One can be converted to the other using the rules inherent in the camera's software. The data in the RAW file can't be restructured to make a different image without changing the data." Basically what I said was that what I just quoted you as saying is a 'nonsense idea'. Basically what you said in the first 5 words is that it is true that you cannot say there is a one to one relationship as you had previously stated. You are I suppose referring to my use of "Subject to statistical error limitations,.. ". Then you go on to again say something that you've just pointed out isn't true. "Subject to statistical error limitations" is not escapable by just denying that it exists. 
It does (though that is not really a valid term for what happens), and therefore, because indeed the results *are* subject to variations, you cannot get a one to one correspondence. Is this now how you are explaining your belief that more than one image corresponds to a particular RAW file? It's like saying more than one house can be built from a set of plans because the builder will read their tape measure slightly differently each time. Well, that's what I meant when I wrote of "Subject to statistical error limitations". Indeed, there are an infinite number of different images that could produce exactly the same digital data. That is called "distortion", and is not really a "statistical error limitation", simply because it is precisely known what the digital results will be for each and every known possible analog signal input. We are talking about images projected on to the sensor. There is a fixed correlation between the light which has fallen on a sensel and the data which is recorded in the RAW file. Similarly there is a fixed correlation between the data in the RAW file and the light which fell on the sensel which created the data. If you know one, you can determine the other. Statistical error limitations would actually, however, be a correct term to apply to the noise errors caused by such things as clock jitter, thermal noise, etc. etc. And uncertainty in A to D conversion. Unless you can explain how a particular electronic charge from the sensor can be saved _in_its_original form_ on a Compact Flash card then you have to accept that the data from the sensor is transformed before it is saved. It is encoded. Not transformed. Encoding doesn't change the results. A transform is where a math formula is applied to change the resulting image (color, exposure, rotation, etc.). That's one form of transformation. Another is where in a system of Cartesian coordinates the point of origin is changed. 
This requires that all the coordinates defining a shape be changed (transformed) but does not change the shape. Or the data may be transformed from Cartesian format to polar. Encoding a message does not change the message but it does transform it. (A caterpillar is transformed into a butterfly... Data is encoded in binary or perhaps some m-ary higher level code such as PCM.) In no case is it reversible. You can track back through the logic and determine the original state of the sensor which gave rise to the particular Raw data file. No you cannot. Why not? Mind you, you would have to know Nikon's original transformation algorithm before you could work it backwards. I wouldn't like to have to do it. It's common knowledge. There are an awful lot of people who wish it was. Indeed, the whole idea of interpolation is to allow someone to do as close to that as is possible. The difference though is that if you don't know what the end result is supposed to look like (which will allow adjusting parameters until you get an image that looks "right"), it is not possible. That is because there is not enough data retained to determine precisely which of the infinite number of possible original images actually did produce the image. I think you are back to the image created on a screen (or whatever) from the RAW data. Hence, through interpolation you can get any one of an infinite number of images, but you cannot determine which of them is indeed that original "source image" you've talked about. that only the one sensor image corresponds to any one RAW file data It is easy to show that there are multiple images on the sensor that could produce exactly the same data set. I invite you to do so. There are, for output values at any given sensor location, an infinite number of possible light intensities that could cause each actual possible digital value. 
That is, if the digital value is 2012, there are an infinite number of light values which would result in a value between 2011 and 2013, and all of them will be assigned to a value of 2012. You forget you are dealing with electrons, which can only be described by integer numbers. You cannot determine which is which. Now, lest you claim it is insignificant, keep in mind that between values 2 and 4 the light intensity doubles. That means there are, for value 3, an infinite number of values which result from about a 1.4x increase in light on the sensor. Not exactly an insignificant change in light! But you cannot determine, for that range, which is correct over a 1.4x range of light intensity. See http://www.clarkvision.com/imagedeta...mance.summary/ for a better indication of the number of electrons you can expect to deal with: 50,000 or more. Well that's a good wriggle but if you read back above you will find there are many points you have to answer. But the fact is that yes I *can* answer all of your questions. The problem is that you stubbornly refuse to learn anything. I would particularly like a clear explanation of how there can be multiple images on the sensor that could produce exactly the same data set. After all, this is where we came in. Then why haven't you made an effort to learn something? I've explained that several times in several different ways, and have done so again in another way in this article. It isn't rocket science. But it *is* something I've been dealing with for 40 years now... and hence it is not at all surprising that I might understand it rather well and can explain it in 15 different ways. But it does make all of your petty insults a bit of a joke, Eric. Congratulations! 40 years ago you were working in Bell Labs along with Boyle and Smith! And I thought you were only a psychologist. Eric Stevens
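The integer-electron objection and the many-to-one argument can be reconciled with the figures just cited: even if the sensor's charge were read out as exact electron counts, a 12-bit file could not hold them all distinctly. A sketch (Python; the 50,000-electron full well is the rough figure from the page cited above, not a specific sensor's spec):

```python
full_well = 50_000    # electrons at saturation (rough figure cited above)
adc_levels = 4096     # distinct codes available from a 12-bit converter

electrons_per_code = full_well / adc_levels
print(f"~{electrons_per_code:.1f} electrons per ADC code")

# Roughly a dozen distinct (integer!) electron counts collapse into each
# single raw value, so the mapping is many-to-one even before considering
# that the charge-to-voltage readout stage is itself analog.
```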
#49
Could you actually see photos made from RAW files?
On Wed, 03 Jun 2009 11:37:23 +1000, Bob Larter
wrote: Eric Stevens wrote: On Mon, 01 Jun 2009 18:27:05 -0800, (Floyd L. Davidson) wrote: Eric Stevens wrote: I've got to plead guilty to that. In a moment of brain fade I got sucked into what Floyd was trying to talk about rather than the original topic which I was trying to talk about. Rereading all this below I can see that I had become more than somewhat confused. Let's be more honest about it, Eric. I didn't confuse you at all, simply because you started out in total confusion. It's finally beginning to sink in, to the point where even you can see it. I'm talking in the hypothetical sense of being able to derive from the RAW data the light pattern which fell on the sensor to create the RAW data in the first place. In the case of RAW data which has not been messed around in some way there is only the one sensor image which will correspond. That last sentence is not valid, and is a major source for your confusion. As nospam said, that doesn't make sense. Are you really saying that a given RAW data file can be created by more than one image? Yes. If you think carefully about how an image sensor works, it's obvious. I may be missing something but it's not obvious to me. A lens directs light from a scene so as to form an image on the camera's sensor. Different parts of the image fall on individual sensels which, in the time allowed to them, capture photons which generate electrons. The accumulated electrons form an electrical charge in each sensel. According to the type of sensel, the charge is 'read' in one way or another, and the quantity of charge converted to digital data. The digital value of the charge is saved in an array which enables the value of the charge for each individual sensel to be mapped to the position of the sensel. Leaving out the question of whether or not the data is further massaged by the camera before-hand, the data array is saved as the RAW data file for the image. That original image on the sensor is characterised by the raw data array. 
Any change in the image gives rise to a different data array. As far as I can see only the one image can give rise to a particular data array. I would be grateful for an explanation of how a different image can give rise to the same data array. Eric Stevens |
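The sensel-to-RAW pipeline described in that post can be sketched in a few lines. This is a toy model only: the 12-bit depth and the full-well capacity of 40,000 electrons are illustrative assumptions, not figures from any real camera, and real cameras add noise, black-level offsets, and other per-model processing that is omitted here.

```python
# Toy model: light falls on each sensel, the accumulated charge saturates
# at the well capacity, and an ADC maps the charge to an integer that is
# stored in an array indexed by sensel position. All constants are made up.

BIT_DEPTH = 12          # many cameras digitise to 12 or 14 bits
FULL_WELL = 40_000      # hypothetical electron capacity of one sensel

def digitise(photon_counts):
    """Map per-sensel photon counts to a RAW-style array of integers."""
    scale = (2 ** BIT_DEPTH - 1) / FULL_WELL   # ADC gain
    return [[round(min(max(p, 0), FULL_WELL) * scale) for p in row]
            for row in photon_counts]          # one value per sensel position

# A tiny 2x2 "image on the sensor": photon counts per sensel.
image = [[40_000, 20_000],
         [0,      10_000]]
raw = digitise(image)
print(raw)  # a saturated sensel reads 4095, an empty one reads 0
```

Note that the quantisation step is exactly where the mapping stops being strictly one-to-one: two charge levels that fall within the same ADC step produce the same stored value.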
#50
Could you actually see photos made from RAW files?
On Wed, 03 Jun 2009 11:34:55 +1000, Bob Larter wrote: Eric Stevens wrote: On Mon, 01 Jun 2009 06:12:17 -0400, nospam wrote: In article , Eric Stevens wrote: [..]

I've quoted from the original articles, and you still keep trying to switch to conversion from RAW to JPG. That's an entirely different question.

Then stop talking about processing the RAW data to make an image.

I haven't been. If anything I've been talking about working backwards from the RAW data file to reconstruct the original image.

that doesn't make any sense.

I'm talking in the hypothetical sense of being able to derive from the RAW data the light pattern which fell on the sensor to create the RAW data in the first place. In the case of RAW data which has not been messed around in some way there is only the one sensor image which will correspond.

That's incorrect. In a Bayer pattern image sensor, for example, fully half of the sensels can't 'see' red or blue light. As a theoretical example, imagine if you projected a series of images onto the sensor, patterned such that the only difference was in the amounts of red/blue light on the green-sensitive sensels; you would get a series of identical RAW files. This is obviously a very contrived example, but it demonstrates the possibility.

Yes, I will accept that, but that's a very special case. It would in fact require a source image composed of elements below the resolution of the sensor. I don't think that's what we had in mind.

Eric Stevens
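Bob's contrived example can be demonstrated concretely. The sketch below assumes the common RGGB Bayer tile and uses made-up light values: two scenes that differ only in the red and blue light falling on green-filtered sensel positions yield byte-identical RAW mosaics, so the mapping from scene to RAW file is many-to-one.

```python
# Each sensel sits behind a colour filter and records only that channel.
# Here a "scene" gives full RGB light at every position; the mosaic keeps
# only what each sensel's filter passes, as a Bayer sensor does.

BAYER = [['R', 'G'],
         ['G', 'B']]  # repeating 2x2 RGGB tile (a common layout)

def mosaic(scene):
    """Keep, at each position, only the channel that sensel's filter passes."""
    return [[scene[y][x][BAYER[y % 2][x % 2]]
             for x in range(len(scene[0]))]
            for y in range(len(scene))]

# Two 2x2 scenes, identical except for the red/blue light at the
# green-filtered positions (top-right and bottom-left).
scene_a = [[{'R': 9, 'G': 5, 'B': 1}, {'R': 7, 'G': 4, 'B': 2}],
           [{'R': 3, 'G': 6, 'B': 8}, {'R': 1, 'G': 2, 'B': 5}]]
scene_b = [[{'R': 9, 'G': 5, 'B': 1}, {'R': 0, 'G': 4, 'B': 0}],
           [{'R': 0, 'G': 6, 'B': 0}, {'R': 1, 'G': 2, 'B': 5}]]

print(mosaic(scene_a) == mosaic(scene_b))  # True: identical RAW mosaics
```

As Eric notes, the differing detail has to vary below the mosaic's per-channel sampling resolution, which is why this is a special case rather than a practical concern.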