#1
"Exposure" vs "Digitization"
After spending some time under the hood, so to speak, of RAW capture and data, I find it increasingly difficult to use the term "exposure" to refer to the relative degree of photon saturation in a JPEG or RAW at a given ISO. The true analog of slide-film exposure is the analog exposure on the sensor; the ISO settings of a digital camera are like selecting different exposure ranges of a slide to be digitized by a scanner. Why, then, do we call utilizing the specified range "exposure"? I often substitute the word "digitized" in this context, but it draws strange reactions from some people.

-- John P Sheehy
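The slide/scanner analogy can be made concrete with a small sketch (my own illustration, not from the thread; it assumes a hypothetical 12-bit camera in which the ISO setting acts as pure analog gain before the ADC):

```python
# The ISO dial as "range selection" for digitization, per the slide/scanner
# analogy: the analog exposure e on the sensor is fixed; the ISO setting
# chooses which slice of that exposure maps onto the full digital scale.
def digitize(e, iso, base_iso=100, bits=12):
    """Map analog exposure e (1.0 = sensor saturation at base ISO) to a RAW value."""
    gain = iso / base_iso
    full_scale = 2 ** bits - 1
    return min(int(e * gain * full_scale), full_scale)

print(digitize(1 / 16, 100))   # 255: only the bottom 1/16th of the scale is used
print(digitize(1 / 16, 1600))  # 4095: the same analog exposure fills the 12-bit range
```

The point of the analogy: raising the ISO here changes nothing about the light that hit the sensor, only how that fixed analog signal is mapped into digital values.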
#3
In message , Gregory Blank wrote:

>> After spending some time under the hood, so to speak, of RAW capture
>> and data, I find it increasingly difficult to use the term "exposure"
>> to refer to the relative degree of photon saturation in a JPEG or RAW
>> at a given ISO. [snip]
>
> In truth it's not about exposure, analog or digital... it's selective
> contrast determination and what can be recorded within the parameters
> of the hardware. That is, it's more or less about what you wish to drop
> or pick up when you select analog or digital, but only within a narrow
> reference as given by the maker of the film or the maker of the camera.
> Perhaps it's more an issue of word choice for people less able to grasp
> the concept. But you are exposing the sensor to light, so you are
> making an exposure.

To answer you quite directly: for lack of a better description, and to be concise.

If someone decides that "ISO 100 gives the best quality" and gets an image that utilizes only 1/16th of the RAW values available, they would have had a much better image with the camera set to ISO 1600 at the same aperture and shutter speed. I have a hard time saying that they "under-exposed" the image; it makes more sense to say that they under-digitized it (quantized it) by using too low an ISO.

-- John P Sheehy
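The 1/16th figure translates directly into lost tonal precision. A rough back-of-the-envelope, assuming a 12-bit RAW file (4096 levels):

```python
# Under-digitizing by four stops (peaking at 1/16th of full scale) on a
# 12-bit camera leaves only 8 bits of tonal precision for the entire image.
import math

RAW_LEVELS = 4096  # 12-bit RAW file

def usable_bits(peak_fraction):
    """Bits of tonal precision when the brightest pixel reaches this fraction of clipping."""
    return math.log2(RAW_LEVELS * peak_fraction)

print(usable_bits(1.0))     # 12.0 bits: full digitization
print(usable_bits(1 / 16))  # 8.0 bits: the "ISO 100" case described above
```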
#4
In article , wrote:

> To answer you quite directly: for lack of a better description, and to
> be concise. If someone decides that "ISO 100 gives the best quality"
> and gets an image that utilizes only 1/16th of the RAW values
> available, they would have had a much better image with the camera set
> to ISO 1600 at the same aperture and shutter speed. I have a hard time
> saying that they "under-exposed" the image; it makes more sense to say
> that they under-digitized it (quantized it) by using too low an ISO.

That's a rather extreme example, and it seems unlikely. That is: is it better for noise, range, and color? Or does one choose to keep two and drop one from the equation? Because if it were better for all three, you would only need one ISO setting and no supplemental light sources. You add flash or lights as needed to make the ISO 100 image.

Digital does not solve the problems that exist beyond the scope of the camera: lighting. More likely, it never will. Lighting is a separate set of issues and requires knowledge. I can't seem to state this enough to people; it's something schools should teach.

--
LF Website @ http://members.verizon.net/~gregoryblank

"To announce that there must be no criticism of the President, or that we are to stand by the President, right or wrong, is not only unpatriotic and servile, but is morally treasonable to the American public." --Theodore Roosevelt, May 7, 1918
#5
In message , Gregory Blank wrote:

>> To answer you quite directly: for lack of a better description, and to
>> be concise. If someone decides that "ISO 100 gives the best quality"
>> and gets an image that utilizes only 1/16th of the RAW values
>> available, they would have had a much better image with the camera set
>> to ISO 1600 at the same aperture and shutter speed. [snip]
>
> That's a rather extreme example, and it seems unlikely.

If you think that's unlikely, you haven't been reading people's posts, or DPReview. The problem of people under-digitizing at ISO 100 is epidemic, because of the myth that ISO settings cause noise. "Why is the sky so noisy in my ISO 100 picture?" is a common question. Of course, it is not just sensor-noisy; it's also highly posterized, and would have looked better taken at a higher ISO setting, even with the same aperture and shutter speed. If they were using a tripod, of course, they could have had a good digitization at a higher absolute exposure at a lower ISO.

I personally don't use ISO 100 very often, but aim for ISO 200 if I can do it with a full digitization. Blooming looms just above RAW value 4095 at ISO 100. In my experiments, the trade-off between sterile posterization and noise indicates that there is very little value in using ISO 100 over ISO 200 on my Canon 20D; the shadows are of approximately equal worth.

> That is: is it better for noise, range, and color? Or does one choose
> to keep two and drop one from the equation? Because if it were better
> for all three, you would only need one ISO setting and no supplemental
> light sources. You add flash or lights as needed to make the ISO 100
> image.

Not in available-light photography, you don't.

> Digital does not solve the problems that exist beyond the scope of the
> camera: lighting. More likely, it never will. Lighting is a separate
> set of issues and requires knowledge.

Optimal lighting is different for digital and film. Color film generally wants to see sunlight or tungsten, depending on the film. Most digitals have neither sunlight nor tungsten as their native white balance; the native balances generally run from magenta to pink lighting with RGB Bayer cameras. My Canon DSLRs get the best images with lighting that is a stop stronger in red than green, and a half stop stronger in blue than green.

> I can't seem to state this enough to people; it's something schools
> should teach.

It should be taught specific to digital, but I doubt that there are many teachers who know the difference. Available-light photography can only be improved by maximizing exposure without clipping, or by using filters over the lens if there is enough light.

-- John P Sheehy
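For reference, a stop is a factor of two in light, so the per-channel balance described above can be turned into linear ratios (my own arithmetic, not from the post):

```python
# Convert per-channel "stops" into linear light ratios.
# One stop = a factor of 2; half a stop = sqrt(2).
def stops_to_ratio(stops):
    return 2.0 ** stops

red_vs_green = stops_to_ratio(1.0)   # red a full stop stronger than green
blue_vs_green = stops_to_ratio(0.5)  # blue half a stop stronger than green

print(red_vs_green)   # 2.0
print(blue_vs_green)  # ~1.414
```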
#6
wrote:

> After spending some time under the hood, so to speak, of RAW capture
> and data, I find it increasingly difficult to use the term "exposure"
> to refer to the relative degree of photon saturation in a JPEG or RAW
> at a given ISO. [snip] I often substitute the word "digitized" in this
> context, but it draws strange reactions from some people.

This is photography, so photographic terms apply. There is nothing wrong with the word "exposure" for digital capture. After all, the sensor is exposed to light for a period of time; during that time the sensors 'charge up' from the exposure, and then the data is recorded.

From Webster's:

    4 : a piece or section of sensitized material (as film) on which an
    exposure is or can be made <36 exposures per roll>

While they state film, the "material" can be anything that is sensitive to photons, including the sites that make up the sensor array in the camera.

Regarding RAW processing, it is analogous (at a high enough level) to the adjustments one might make in the darkroom (pushing, pulling, burning, dodging, pre-flashing the paper, etc.). For that matter, the same applies to scanners.

You may be right about the term "digitization", but you draw strange looks because it is not the familiar term. And there really is nothing wrong with the term "exposure". That word says it all: time x aperture.

Cheers,
Alan

--
-- r.p.e.35mm user resource: http://www.aliasimages.com/rpe35mmur.htm
-- r.p.d.slr-systems: http://www.aliasimages.com/rpdslrsysur.htm
-- [SI] gallery & rulz: http://www.pbase.com/shootin
-- e-meil: Remove FreeLunch.
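"Time x aperture" is conventionally combined into the exposure value, EV = log2(N²/t), where N is the f-number and t the shutter time in seconds. A quick sketch (standard formula; the example camera settings are my own):

```python
# Exposure value: combinations of aperture and shutter time with the same
# EV put the same light on the sensor, regardless of the ISO setting used
# to digitize it afterward.
import math

def exposure_value(f_number, shutter_s):
    return math.log2(f_number ** 2 / shutter_s)

ev_a = exposure_value(8.0, 1 / 125)  # f/8 at 1/125 s
ev_b = exposure_value(5.6, 1 / 250)  # one stop wider, one stop shorter

print(round(ev_a, 2))  # ~12.97
print(round(ev_b, 2))  # ~12.94 (f/5.6 is approximately one stop below f/8)
```

The two settings come out nearly equal because opening the aperture one stop while halving the shutter time trades light for light.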
#8
In message , Alan Browne wrote:

> This is photography, so photographic terms apply. There is nothing
> wrong with the word "exposure" for digital capture. [snip] While they
> state film, the "material" can be anything that is sensitive to
> photons, including the sites that make up the sensor array in the
> camera.

I didn't say that the term "exposure" doesn't apply at all. I said that it wasn't a good way to describe the relative digitization at an ISO setting.

> Regarding RAW processing, it is analogous (at a high enough level) to
> the adjustments one might make in the darkroom (pushing, pulling,
> burning, dodging, pre-flashing the paper, etc.). [snip] And there
> really is nothing wrong with the term "exposure". That word says it
> all: time x aperture.

This is what I'm talking about; this is what I think exposure really means (for a given subject intensity, of course). It is often used, however, for the relative brightness of an image converted with zero exposure adjustment, which, IMO, is more appropriately called digitization.

-- John P Sheehy
#9
Alan Browne wrote:

>> After spending some time under the hood, so to speak, of RAW capture
>> and data, I find it increasingly difficult to use the term "exposure"
>> to refer to the relative degree of photon saturation in a JPEG or RAW
>> at a given ISO. [snip]
>
> This is photography, so photographic terms apply. There is nothing
> wrong with the word "exposure" for digital capture. [snip] And there
> really is nothing wrong with the term "exposure". That word says it
> all: time x aperture.

I would agree with Alan that "exposure" is a reasonable term to use, where the product of the light intensity and a time determines the number of photo-electrons which will be created. To me, as a signal-processing person, digitisation is a process that happens at a single point in time, where an analog value is converted into a digital one. It has nothing to do with the actual value obtained, nor the time taken to acquire that value.

While we are talking about terms, I find JPS's use of the term "posterisation" confusing at best, and meaningless in a signal (or image) processing context. I think what he means may be "quantisation": the fact that an infinite range of analog values must be represented by a limited range of digital levels. Normally, sufficient digital levels are available, and it is the accuracy of the analog signal which determines the signal-to-noise ratio of the system. However, if the quantisation steps are too large, it becomes the quantisation process itself which limits the signal-to-noise ratio.

An example from audio: when sounds are digitised to 8 bits rather than 16, there is an added roughness to the sound; an error is introduced which depends on signal level. In an image, too few bits would show as a contouring effect, where what should be a smooth transition of brightness levels instead appears as a finite series of perceptibly different brightness bands. I would call this effect quantisation error, or more specifically, error due to using too few bits to represent the signal. Is this what you mean by posterisation?

David
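David's contouring effect is easy to demonstrate: quantise a smooth ramp to a small number of bits and count the distinct output levels, each of which becomes one visible band (a minimal sketch of my own, not from the thread):

```python
# Quantise a smooth 0..1 gradient to a given bit depth and count the
# distinct output levels - with too few bits, each level is a visible band.
def quantize(x, bits):
    """Quantise a value in [0, 1) to the given number of bits."""
    levels = 2 ** bits
    return int(x * levels) / levels

ramp = [i / 1000 for i in range(1000)]  # smooth 0..1 gradient, 1000 samples

bands_4bit = len({quantize(v, 4) for v in ramp})    # coarse quantisation
bands_10bit = len({quantize(v, 10) for v in ramp})  # fine quantisation

print(bands_4bit)   # 16: strong, visible banding
print(bands_10bit)  # 1000: every ramp sample keeps a distinct level
```

At 4 bits the 1000-sample gradient collapses to 16 flat bands; at 10 bits every sample survives as a distinct level, so no contouring is visible.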