PhotoBanter.com

PhotoBanter.com (http://www.photobanter.com/index.php)
-   Digital SLR Cameras (http://www.photobanter.com/forumdisplay.php?f=21)
-   -   "Exposure" vs "Digitization" (http://www.photobanter.com/showthread.php?t=48644)

[email protected] August 7th 05 01:13 PM

"Exposure" vs "Digitization"
 
After spending some time under the hood, so to speak, of RAW capture and
data, I find it increasingly difficult to use the term "exposure" to
refer to the relative degree of photon saturation in a JPEG or RAW at a
given ISO. The analog to slide film exposure is actually the analog
exposure on the sensor; the ISO settings of the digital camera are like
setting different ranges of exposure in a slide to be digitized by a
scanner.
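That scanner analogy can be sketched numerically. This is a toy model that treats ISO as a pure analog gain applied before a 12-bit ADC; the full-well figure is made up for illustration and is not from any particular camera:

```python
# Toy model: ISO as an analog gain applied before the ADC, like choosing
# the digitization range when scanning a slide. Illustrative numbers only.

FULL_WELL = 40000       # electrons at sensor saturation (assumed)
ADC_MAX = 4095          # 12-bit ADC

def digitize(electrons, iso, base_iso=100):
    """Map an analog electron count to a RAW value at a given ISO."""
    gain = iso / base_iso                    # higher ISO = more gain
    raw = electrons * gain * ADC_MAX / FULL_WELL
    return min(ADC_MAX, round(raw))          # clip at ADC saturation

# The same analog exposure (same photons) lands on different RAW values:
signal = 2000  # electrons captured by one photosite
print(digitize(signal, 100))    # → 205  (small RAW value, coarse steps)
print(digitize(signal, 1600))   # → 3276 (16x larger value, finer steps)
```

The analog exposure is identical in both calls; only the digitization range changes.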

Why, then, do we call utilizing the specified range "exposure"? I often
substitute the word "digitized" in this context, but it draws strange
reactions from some people.
--


John P Sheehy


Gregory Blank August 7th 05 03:03 PM

In article ,
wrote:

After spending some time under the hood, so to speak, of RAW capture and
data, I find it increasingly difficult to use the term "exposure" to
refer to the relative degree of photon saturation in a JPEG or RAW at a
given ISO. The analog to slide film exposure is actually the analog
exposure on the sensor; the ISO settings of the digital camera are like
setting different ranges of exposure in a slide to be digitized by a
scanner.



Why, then, do we call utilizing the specified range "exposure"? I often
substitute the word "digitized" in this context, but it draws strange
reactions from some people.


In truth, it's not about exposure, analog or digital... it's selective
contrast determination, and what can be recorded within the parameters of
the hardware. That is, it's more or less about what you wish to drop or
pick up when you select to use analog or digital. But only within a
narrow reference as given by the maker of the film or the maker of the
camera.

Perhaps it's more an issue of word choice for people less able to grasp
the concept. But you are exposing the sensor to light, so you are making
an exposure.

To answer you quite directly: for lack of a better description, and to
be concise.

--
LF Website @
http://members.verizon.net/~gregoryblank

"To announce that there must be no criticism of the President,
or that we are to stand by the President, right or wrong,
is not only unpatriotic and servile, but is morally treasonable
to the American public."--Theodore Roosevelt, May 7, 1918

[email protected] August 7th 05 03:24 PM

In message ,
Gregory Blank wrote:

In article ,
wrote:


After spending some time under the hood, so to speak, of RAW capture and
data, I find it increasingly difficult to use the term "exposure" to
refer to the relative degree of photon saturation in a JPEG or RAW at a
given ISO. The analog to slide film exposure is actually the analog
exposure on the sensor; the ISO settings of the digital camera are like
setting different ranges of exposure in a slide to be digitized by a
scanner.


Why, then, do we call utilizing the specified range "exposure"? I often
substitute the word "digitized" in this context, but it draws strange
reactions from some people.


In truth, it's not about exposure, analog or digital... it's selective
contrast determination, and what can be recorded within the parameters of
the hardware. That is, it's more or less about what you wish to drop or
pick up when you select to use analog or digital. But only within a
narrow reference as given by the maker of the film or the maker of the
camera.


Perhaps it's more an issue of word choice for people less able to grasp
the concept. But you are exposing the sensor to light, so you are making
an exposure.


To answer you quite directly: for lack of a better description, and to
be concise.


If someone decides that "ISO 100 gives the best quality" and gets an
image that utilizes only 1/16th of the RAW values available, they would
have had a much better image if they had the camera set to ISO 1600 with
the same aperture and shutter speed. I have a hard time saying that
they "under-exposed" the image; it makes more sense to say that they
under-digitized it (quantized it) by using too low an ISO.
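The arithmetic in that example can be sketched, assuming a 12-bit RAW file: 1/16th of the scale is four stops down, which is exactly the gap between ISO 100 and ISO 1600 at a fixed aperture and shutter speed:

```python
import math

ADC_MAX = 4095  # 12-bit RAW

# An image whose brightest value reaches only 1/16th of the RAW scale
# at ISO 100 spans this many distinct levels:
levels_iso100 = ADC_MAX // 16 + 1          # 256 levels

# At ISO 1600 (16x the analog gain), the same analog exposure
# fills the whole scale:
levels_iso1600 = ADC_MAX + 1               # 4096 levels

print(math.log2(levels_iso100))    # → 8.0  effective bits
print(math.log2(levels_iso1600))   # → 12.0 effective bits
```

Same photons either way; the low-ISO version simply throws away four bits of tonal resolution.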
--


John P Sheehy


Gregory Blank August 7th 05 03:59 PM

In article ,
wrote:

To answer you quite directly: for lack of a better description, and to
be concise.


If someone decides that "ISO 100 gives the best quality" and gets an
image that utilizes only 1/16th of the RAW values available, they would
have had a much better image if they had the camera set to ISO 1600 with
the same aperture and shutter speed. I have a hard time saying that
they "under-exposed" the image; it makes more sense to say that they
under-digitized it (quantized it) by using too low an ISO.



That's a rather extreme example... and it seems unlikely.

That is: is it better for noise, range, and color? Or does one make the
choice to keep two and drop one from the equation? Because if it's better
for all three, you would only need one ISO setting and no supplemental
light sources.

You add flash or lights as needed to make the ISO 100 image. Digital
does not solve the problems that exist beyond the scope of the camera:
lighting. And more likely it never will. Lighting is a separate set of
issues and requires knowledge. I can't seem to state this enough to
people; it's something schools should teach :)

--
LF Website @
http://members.verizon.net/~gregoryblank


[email protected] August 7th 05 04:56 PM

In message ,
Gregory Blank wrote:

In article ,
wrote:

To answer you quite directly: for lack of a better description, and to
be concise.


If someone decides that "ISO 100 gives the best quality" and gets an
image that utilizes only 1/16th of the RAW values available, they would
have had a much better image if they had the camera set to ISO 1600 with
the same aperture and shutter speed. I have a hard time saying that
they "under-exposed" the image; it makes more sense to say that they
under-digitized it (quantized it) by using too low an ISO.


That's a rather extreme example... and it seems unlikely.


If you think that's unlikely, you haven't been reading people's posts,
or DPReview. The problem of people under-digitizing at ISO 100 is
epidemic, because of the myth that ISO settings cause noise.

"Why is the sky so noisy in my ISO 100 picture?" is a common question.
Of course, it is not just sensor-noisy, it's also highly posterized as
well, and would have looked better if taken at a higher ISO setting,
even with the same aperture and shutter speed. If they were using a
tripod, of course, they could have had a good digitization at a higher
absolute exposure at a lower ISO. I personally don't use ISO 100 very
often, but aim for ISO 200 if I can do it with a full digitization.
Blooming looms just above RAW value 4095 at ISO 100. In my experiments,
the trade-off between sterile posterization and noise indicates that
there is very little value in using ISO 100 over ISO 200 on my Canon
20D. The shadows are of approximately equal worth.

That is: is it better for noise, range, and color? Or does one make the
choice to keep two and drop one from the equation? Because if it's better
for all three, you would only need one ISO setting and no supplemental
light sources.


You add flash or lights as needed to make the ISO 100 image.


Not in available light photography, you don't.

Digital
does not solve the problems that exist beyond the scope of the camera:
lighting. And more likely it never will. Lighting is a separate set of
issues and requires knowledge.


Optimal lighting is different for digital and film. Color film
generally wants to see sunlight or tungsten, depending on the film.
Most digitals have neither sunlight nor tungsten as their native white
balance. The native balances generally run from magenta to pink
lighting with RGB Bayer cameras. My Canon DSLRs get the best images
with lighting that is a stop stronger red than green, and a half stop
stronger blue than green.
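Expressed as linear ratios (a quick conversion, since one photographic stop is a factor of two; the specific stop figures are just the ones reported above for those cameras, not a general rule):

```python
# Convert the stop differences quoted above into linear lighting ratios,
# relative to the green channel (1 stop = a factor of 2).
red_vs_green = 2 ** 1.0    # one stop stronger red   → 2.0x
blue_vs_green = 2 ** 0.5   # half a stop stronger blue → ~1.41x

print(red_vs_green)              # → 2.0
print(round(blue_vs_green, 2))   # → 1.41
```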

I can't seem to state this enough to
people; it's something schools should teach :)


It should be taught specific to digital, but I doubt that there are many
teachers who know the difference.

Available-light photography can only be improved by maximizing exposure
without clipping, or using filters over the lens if there is enough
light.

--


John P Sheehy


Alan Browne August 7th 05 05:18 PM

wrote:

After spending some time under the hood, so to speak, of RAW capture and
data, I find it increasingly difficult to use the term "exposure" to
refer to the relative degree of photon saturation in a JPEG or RAW at a
given ISO. The analog to slide film exposure is actually the analog
exposure on the sensor; the ISO settings of the digital camera are like
setting different ranges of exposure in a slide to be digitized by a
scanner.

Why then, do we call utilizing the specified range "exposure". I often
substitute the word "digitized" in this context, but it draws strange
reactions from some people.


This is photography so photographic terms apply.

There is nothing wrong with the word exposure for digital capture.
After all, the sensor is exposed to light for a period of time; during
that time the sensor's photosites 'charge up' from the exposure, and
then the data is recorded.

From Webster's:
4 : a piece or section of sensitized material (as film) on which an
exposure is or can be made <36 exposures per roll>

While they state film, the "material" can be anything that is sensitive
to photons including the sensors (sites) that make up the sensor array
in the camera.

Regarding RAW processing, it is analogous (at a high enough level) to
the adjustments one might make in the darkroom (pushing, pulling,
burning, dodging, pre-flashing the paper ... etc.)

For that matter, the same applies to scanners.

You may be right about the term "digitization", but you draw strange
looks because it is not the familiar term. And there really is nothing
wrong with the term exposure. That word says it all: time X aperture.

Cheers,
Alan

--
-- r.p.e.35mm user resource:
http://www.aliasimages.com/rpe35mmur.htm
-- r.p.d.slr-systems: http://www.aliasimages.com/rpdslrsysur.htm
-- [SI] gallery & rulz: http://www.pbase.com/shootin
-- e-meil: Remove FreeLunch.

[email protected] August 7th 05 05:37 PM

wrote:

After spending some time under the hood, so to speak, of RAW capture and
data, I find it increasingly difficult to use the term "exposure" to
refer to the relative degree of photon saturation in a JPEG or RAW at a
given ISO.


If someone decides that "ISO 100 gives the best quality" and gets an
image that utilizes only 1/16th of the RAW values available, they would
have had a much better image if they had the camera set to ISO 1600 with
the same aperture and shutter speed. I have a hard time saying that
they "under-exposed" the image; it makes more sense to say that they
under-digitized it (quantized it) by using too low of an ISO.


Is or is not the ISO "given", as you initially described the situation?

If by "given" you mean "fixed in advance", then:

If the ISO is "given", "under-exposure" (or "over-exposure") describes
the situation correctly.

If the exposure is "given", then "under-exposure" (etc.) is also
appropriate because of historical use (even today ISO is not as
variable as exposure), but also because the assumptions are known. But
if you don't like it, use "under-amplified", "lack of gain", or
something else, and no one will quibble.


[email protected] August 7th 05 05:38 PM

In message ,
Alan Browne wrote:

This is photography so photographic terms apply.


There is nothing wrong with the word exposure for digital capture.
After all, the sensor is exposed to light for a period of time; during
that time the sensor's photosites 'charge up' from the exposure, and
then the data is recorded.

From Webster's:
4 : a piece or section of sensitized material (as film) on which an
exposure is or can be made <36 exposures per roll>

While they state film, the "material" can be anything that is sensitive
to photons including the sensors (sites) that make up the sensor array
in the camera.


I didn't say that the term exposure doesn't apply at all. I said that
it wasn't a good way to describe the relative digitization at an ISO
setting.

Regarding RAW processing, it is analogous (at a high enough level) to
the adjustments one might make in the darkroom (pushing, pulling,
burning, dodging, pre-flashing the paper ... etc.)

For that matter, the same applies to scanners.

You may be right about the term "digitization", but you draw strange
looks because it is not the familiar term. And there really is nothing
wrong with the term exposure. That word says it all: time X aperture.


This is what I'm talking about; this is what I think exposure really
means (for a given subject intensity, of course). It is often used,
however, for the relative brightness of an image converted with 0
exposure adjustment, which, IMO, is more appropriately called
digitization.
--


John P Sheehy


David J Taylor August 7th 05 05:39 PM

Alan Browne wrote:
wrote:

After spending some time under the hood, so to speak, of RAW capture
and data, I find it increasingly difficult to use the term
"exposure" to refer to the relative degree of photon saturation in a
JPEG or RAW at a given ISO. The analog to slide film exposure is
actually the analog exposure on the sensor; the ISO settings of the
digital camera are like setting different ranges of exposure in a
slide to be digitized by a scanner.

Why then, do we call utilizing the specified range "exposure". I
often substitute the word "digitized" in this context, but it draws
strange reactions from some people.


This is photography so photographic terms apply.

There is nothing wrong with the word exposure for digital capture.
After all, the sensor is exposed to light for a period of time;
during that time the sensor's photosites 'charge up' from the
exposure, and then the data is recorded.

From Webster's:
4 : a piece or section of sensitized material (as film) on which an
exposure is or can be made <36 exposures per roll>

While they state film, the "material" can be anything that is
sensitive to photons including the sensors (sites) that make up the
sensor array in the camera.

Regarding RAW processing, it is analogous (at a high enough level) to
the adjustments one might make in the darkroom (pushing, pulling,
burning, dodging, pre-flashing the paper ... etc.)

For that matter, the same applies to scanners.

You may be right about the term "digitization", but you draw strange
looks because it is not the familiar term. And there really is
nothing wrong with the term exposure. That word says it all: time X
aperture.
Cheers,
Alan


I would agree with Alan that "exposure" is a reasonable term to use, where
the product of the light intensity and a time determines the number of
photo-electrons which will be created.

To me, as a signal-processing person, digitisation is a process that
happens at a single point in time, where an analog value is converted into
a digital one. It has nothing to do with the actual value obtained, nor
the time taken to acquire that value.

While we are talking about terms, I find JPS's use of the term
"posterisation" confusing at best, and meaningless in a signal (or image)
processing context. I think what he means may be "quantisation", the fact
that an infinite range of analog values must be represented by a limited
range of digital levels. Normally, sufficient digital levels are
available and it is the accuracy of the analog signal which determines the
signal-to-noise ratio of the system. However, if the quantisation steps
are too large, it becomes the quantisation process itself which limits the
signal-to-noise ratio.

An example from audio might be that when sounds are digitised to 8-bits
rather than 16-bits, there is an added roughness to the sound. An error
is introduced which depends on signal level. In an image, too few bits
would show as a contouring effect where what should be a smooth transition
of brightness levels instead appears as a finite series of perceptibly
different brightness bands.
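The rule of thumb behind that audio comparison can be written out. For a full-scale sine wave, the ideal quantisation-limited SNR of an N-bit converter is roughly 6.02·N + 1.76 dB (a textbook approximation, not specific to any camera or recorder):

```python
def quantization_snr_db(bits):
    """Ideal SNR of an N-bit quantizer for a full-scale sine input."""
    return 6.02 * bits + 1.76

print(round(quantization_snr_db(8), 1))    # → 49.9 (dB; audible roughness)
print(round(quantization_snr_db(16), 1))   # → 98.1 (dB; below audibility)
```

The roughly 48 dB gap between 8 and 16 bits is the "added roughness" in the audio example; too few bits in an image produces the analogous contouring.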

I would call this effect quantisation errors, or more specifically errors
due to using too few bits to represent the signal. Is this what you mean
by posterisation?

David



David J Taylor August 7th 05 06:24 PM

wrote:
[]
This is what I'm talking about; this is what I think exposure really
means (for a given subject intensity, of course). It is often used,
however, for the relative brightness of an image converted with 0
exposure adjustment, which, IMO, is more appropriately called
digitization.


In signal processing we might use the term "headroom". If the loudest
level a system could capture were 8, and the loudest sound to be recorded
were 4, then we might say the headroom was 6 dB (an engineering term for a
factor of 2 in linear voltage or current terms). Similarly, if the maximum
value from your image sensor is 4095, and the white level of a particular
image were 2047, then you also have a factor of two headroom. Were any
part of the image to exceed the maximum value - a specular highlight, for
example - we would say the value was clipped.
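The same figures in code, as a small sketch of the dB convention described above (20·log10 of a linear amplitude ratio):

```python
import math

def headroom_db(max_value, peak_value):
    """Headroom between a system's maximum and a signal's peak, in dB."""
    return 20 * math.log10(max_value / peak_value)

print(round(headroom_db(8, 4), 2))        # audio example  → 6.02 (dB)
print(round(headroom_db(4095, 2047), 2))  # sensor example → 6.02 (dB)
```

Both cases are a factor-of-two ratio, hence the same roughly 6 dB of headroom.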

"Digitisation" is simply the process of converting analog to digital.

David



