A Photography forum. PhotoBanter.com


Digital vs Film Resolution



 
 
  #91
October 2nd 04, 06:34 PM
Dave Martindale

bob writes:

I know unsharp-mask is originally a film
technique, but how commonly is it actually done in 35mm and MF?


I don't think it is done in smaller formats. To make a USM, you "contact
print" one negative onto another. A spacer is used between the negatives,
with a diffuser placed on top of the original. The light shines through the
diffuser and scatters through the negative onto the film. The thickness of
the spacer determines the strength of the effect.


The thickness of the spacer determines how blurred the mask is, which
sets the cutoff between fine detail that's not affected by the mask and
coarse detail that is.

You also have control over the contrast of the mask, determined by the
development time of the mask film.

Film unsharp masking isn't quite the same as the Photoshop effect of the
same name. In film, an unsharp mask is used to reduce large-detail
contrast while leaving small-detail contrast alone. The Photoshop effect
leaves large-detail contrast unaffected while boosting small-detail
contrast.
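
For anyone curious, the digital version of the effect is easy to sketch.
Here's a minimal Python example of the blur-subtract-add idea behind the
Photoshop-style filter (assuming numpy and scipy are installed; the
function and parameter names are just illustrative, not anybody's actual
implementation):

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, radius=2.0, amount=1.0):
    # image  : 2-D float array (one channel), values 0..1
    # radius : sigma of the Gaussian blur that forms the "mask"
    # amount : how strongly the fine detail is added back
    blurred = gaussian_filter(image, sigma=radius)   # the blurred mask
    detail = image - blurred                         # only fine detail survives
    return np.clip(image + amount * detail, 0.0, 1.0)

The film technique uses the same blurred mask the other way around:
sandwiching it with the original reduces large-scale density differences,
while the fine detail that the blurred mask cannot follow passes through
unchanged.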

Dave
  #92
October 3rd 04, 02:23 PM
Bart van der Wolf


"Jonathan Wilson" wrote in message
...
SNIP
I'm curious... (note I haven't shot 35mm since I was 8 and
I'm 37 now, lol)

If I took a photo with 35mm film and with say a 300D
(6MP), then produced 3 prints of say the top corner
(1/4 size) of the images with...

1, the original 35mm blown up to say A3
2, the 300D blown up to A3
3, a scan of the 35mm (high-res scan) blown up to A3

Which would have the apparent best quality, especially if
the software didn't reduce the output for the digital to a
lower DPI and actually used some upscaling system
(such as Qimage)?


Hard to predict, because that depends, amongst other things, on the skill
of the photographer/operator and on the definition of quality.
Option 1 doesn't allow much control, except over general color balance;
option 2 depends on the postprocessing and upscaling quality; and option 3
depends on the scanner/operator and postprocessing.

Would the original end up showing grain, whereas the scan
and printing software would actually hide the grain due to its
printing/upscaling/Lanczos or some other combination?


In case 1, the only way graininess can be controlled is by using a
diffuse light-source enlarger, which also changes contrast depending on
density (and on the paper choice combined with processing).
Cases 2 and 3 both allow noise/graininess to be reduced with noise
reduction software, which can be very effective. Upscaling doesn't
help noise; it only increases its size, although there are methods
that simulate edge detail at the expense of fine detail/gradients.
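
As an aside, the "upscaling only makes the existing pixels (and their
noise) larger" point is easy to demonstrate for yourself. A minimal
sketch with the Pillow library (the filename and the 3x factor are just
illustrative choices):

from PIL import Image

# Upscale 3x with Lanczos resampling: the pixel count goes up 9x,
# but no new detail is created - existing detail, grain and noise
# are simply rendered larger (and slightly smoothed by the filter).
src = Image.open("frame_scan.tif")          # hypothetical input scan
w, h = src.size
big = src.resize((3 * w, 3 * h), Image.LANCZOS)
big.save("frame_scan_3x.tif")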

Also, would the digital with upscaling produce an image equal
to one, both, or neither of the above? Would the result be
subjective, as some bits of the resultant print might look better
while others would look worse compared to the film original
enlargement or the digital scan enlargement?


Upon close inspection it will be apparent that the DSLR lacks real
resolution by comparison, but it has good edge contrast and low noise by
itself (even without noise reduction). From an appropriate viewing
distance, the difference may be hard to notice. It'll take a 1Ds, or its
Mark II successor, to really meet or exceed the resolution of low-ISO
film (assuming quality optics, no camera shake, and a stationary
subject).
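
To put some rough numbers on that comparison (back-of-the-envelope
figures, added here for illustration): an A3 sheet is 297 x 420 mm,
about 16.5 inches on the long side, so a 6 MP 300D (3072 x 2048 pixels)
lands on paper at roughly 186 ppi, while a 4000 dpi scan of a full
36 x 24 mm frame (about 5670 pixels on the long side) lands at roughly
343 ppi:

# Rough on-paper ppi for an A3 print (long side = 420 mm).
# Pixel counts are illustrative assumptions, not measurements.
A3_LONG_IN = 420 / 25.4                      # ~16.54 inches

def ppi_on_a3(long_side_pixels):
    return long_side_pixels / A3_LONG_IN

print(round(ppi_on_a3(3072)))                # 300D (3072 x 2048): ~186 ppi
print(round(ppi_on_a3(4000 * 36 / 25.4)))    # 4000 dpi scan of 36 mm: ~343 ppi

Whether that 4000 dpi scan actually contains 343 ppi worth of real
detail is, of course, exactly what the rest of this thread argues about.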

The reason I wonder is that (and I know it's not how to
produce good-quality prints) the current 1-hour (Agfa) systems
actually have a very low resolution... and where film is involved
they develop the film, scan the result, then print from the scan
at 380 dpi for 6x4s and 260 dpi for 8x12s; or perhaps people's
idea of perfection has dropped significantly....


Acceptable quality (but then we are lowering standards) varies with
the goal. Perfection is at the other end of the scale and is mostly
equipment/skill limited. It is mainly the intended application of
the images that poses the practical limit on the amount of perfection
we need.

It should be noted that the print's dpi is not actually the "on
paper" dpi, only the intermediate file's processed dpi, as the
system does some other things to that file before it scans over
the paper... while they don't specify what they do, I'm guessing
it's similar to the way Qimage upscales before sending the data
to the actual printer (or in this case the LEDs/lasers).


Probably similar, yes. But different hardware can also be exploited
differently. For example, although inkjet printers need to dither to
produce intermediate ink colors, where RGB printers don't, they can
also produce higher-resolution edges (e.g. text).

Enquiring about the apparent "low res" of the Agfa, I found out that
the max res for all makes of mini lab is 400 dpi; none actually
resolve 600 dpi, so potentially a home printer exceeds the output
of the mini labs.


For edges, yes, but it's harder to produce smooth/light gradients. Do
note that 300-400 ppi is close to the visual acuity of the human eye,
but there is a noticeable resolution difference with inkjets. There are
also other issues, like color gamut, which make a one-to-one
comparison difficult. Cost and output speed are further factors to
consider, besides quality.
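
The 300-400 ppi figure follows from the usual assumption of roughly one
arc-minute of visual acuity: at a typical 25 cm (about 10 inch) viewing
distance, one arc-minute subtends a bit under 1/340 of an inch. A quick
check in Python (the viewing distance and acuity values are the
conventional assumptions, not measurements):

import math

ACUITY_ARCMIN = 1.0          # conventional visual acuity assumption
VIEW_DIST_IN = 10.0          # ~25 cm reading distance

# Smallest resolvable spacing on the print, and the matching ppi.
spacing_in = VIEW_DIST_IN * math.tan(math.radians(ACUITY_ARCMIN / 60.0))
print(round(1.0 / spacing_in))   # ~344 ppi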

Bart

  #94
October 3rd 04, 02:33 PM
jjs

"Jonathan Wilson"

If I took a photo with 35mm film and with say a 300D
(6MP), then produced 3 prints of say the top corner
(1/4 size) of the images with...

1, the original 35mm blown up to say A3
2, the 300D blown up to A3
3, a scan of the 35mm (high-res scan) blown up to A3

Which would have the apparent best quality, especially if
the software didn't reduce the output for the digital to a
lower DPI and actually used some upscaling system
(such as Qimage)?


First, upscaling (interpolation) does not add detail, nor enhance it, so
forget that. Now define "apparent best quality" - you see, you are asking
an impossible question. Take one instance - sharpness - and the term
"acutance", which means "apparent sharpness", for example: it does not
require high resolution, just specific qualities that give an appearance
of sharpness.


  #95
October 3rd 04, 02:43 PM
jjs

"Bart van der Wolf" wrote in message
...

In case 1, the only way graininess can be controlled is by using a
diffuse light-source enlarger, which also changes contrast depending on
density (and on the paper choice combined with processing).
Cases 2 and 3 both allow noise/graininess to be reduced with noise
reduction software, which can be very effective.


In many subjects fine grain defeats *acutance to a remarkable degree.
Grain is your friend in that regard, even if it is not apparent - that
is, unless we are talking about recon photography, and we aren't. When
consumer digital cameras (not current scanning backs) equal medium
format (by which I mean a minimum of 6x6 cm), I predict there will be a
rise in popularity of an 'add grain' filter - I mean one beyond "add
noise". People will want a digital filter that actually creates
film-like grain to simulate boundary effects without USM.

*I know that you understand the term acutance, Bart, but for the rest:
acutance is 'perceived sharpness', or the impression of sharpness, and
not lp/mm metrics.
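
A very rough idea of what such an 'add grain' filter might do beyond
plain "add noise": make the noise spatially clumped and strongest in the
midtones, the way film grain is. A minimal Python sketch (purely
illustrative; the parameters are made-up names, not any real product's
filter):

import numpy as np
from scipy.ndimage import gaussian_filter

def add_film_like_grain(image, strength=0.06, clump_sigma=0.8, seed=None):
    # image       : 2-D float array, values 0..1
    # strength    : overall grain amplitude
    # clump_sigma : blurring the noise makes it clump like grain
    #               instead of single-pixel "add noise"
    rng = np.random.default_rng(seed)
    noise = gaussian_filter(rng.standard_normal(image.shape), sigma=clump_sigma)
    noise /= noise.std() + 1e-12                    # renormalise amplitude
    midtone_weight = 4.0 * image * (1.0 - image)    # grain shows most in midtones
    return np.clip(image + strength * midtone_weight * noise, 0.0, 1.0)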


  #96
October 3rd 04, 02:58 PM
Dick

I have commercial CDs that are as old as Sony's first CD player. At
least 20 years old. I see no signs of deterioration. I would expect
them to last at least another 20 years. Are home-burned CDs inferior
to commercial CDs?

On Sun, 03 Oct 2004 07:31:45 -0500, wrote:

"Tom Nakashima" wrote:

Always back up on CDs, thought everyone does that...guess not.


You might want to search on CD longevity. When I started out, I was under
the naive impression that backing up to two sets of CDs would be a good
backup strategy. If you only need a backup that lasts a few years, it is.
For longer term archiving, CDs are worthless.

I currently back up to regular IDE hard drives that are unplugged in between
backups.


  #97
October 3rd 04, 03:00 PM
Bruce Murphy

Dick LeadWinger writes:

I have commercial CDs that are as old as Sony's first CD player. At
least 20 years old. I see no signs of deterioration.


Whereas quite a lot of early CDs did suffer various delamination and
oxidisation problems.

I would expect
them to last at least another 20 years. Are home-burned CDs inferior
to commercial CDs?


Yes - they record into an organic dye layer rather than having pits
pressed into a metal film embedded in plastic.

B
  #98
October 3rd 04, 04:55 PM
Bart van der Wolf


"jjs" wrote in message
...
"Bart van der Wolf" wrote in message
...

SNIP
*I know that you understand the term Accutance, Bart, but for
the rest: Accutance is 'perceived sharpness', or the impression of
sharpness, and not lp/mm metrics.


Correct, on both counts ;-)

And to complicate matters, the human eye does that
(http://webvision.med.utah.edu/KallSpatial.html#csf) even before the
brain starts inventing/expecting things, as in this fill-in-the-blanks
example:
http://www.xs4all.nl/~bvdwolf/temp/Triangle-or-not.gif
There is of course no triangle, but we want to see one because of the
cues from the cut-out sectors. We may even briefly imagine the
"triangle" is darker than the pure white around it. IMO that can also
help apparent (not real) resolution when we add noise/grain to an
image, as long as the signal-to-noise ratio is high enough.
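
One well-known concrete case of noise helping apparent (not real)
resolution is adding a little noise before quantising to a small number
of levels, i.e. dithering. A minimal numpy sketch of that idea (added
here purely as an illustration):

import numpy as np

def quantise(image, levels=8, dither=False, seed=0):
    # Reduce a 0..1 image to a few grey levels, optionally adding a
    # little uniform noise first.
    rng = np.random.default_rng(seed)
    x = image.copy()
    if dither:
        x += rng.uniform(-0.5, 0.5, size=x.shape) / (levels - 1)
    return np.clip(np.round(x * (levels - 1)) / (levels - 1), 0.0, 1.0)

# A horizontal ramp shows the effect clearly:
ramp = np.tile(np.linspace(0.0, 1.0, 512), (64, 1))
banded = quantise(ramp, levels=8, dither=False)
smooth = quantise(ramp, levels=8, dither=True)

Without the dither, the 8-level ramp shows hard bands; with it, the eye
averages neighbouring pixels and still perceives a smooth gradient.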

Bart

  #100
October 3rd 04, 07:07 PM
Jer

wrote:


I currently back up to regular IDE hard drives that are unplugged in between
backups.

--
Eric
http://canid.com/



Interesting that someone finally mentioned this method. Hard drives are
like eggs - cheaper by the dozen. I have 20+ hard drives dedicated to
archival use, stored in a data safe, and all are reformatted on their
5th birthday. The cost of this method is comparatively low, it is
significantly more robust, and it is a huge time saver. In the odd
event that one won't spin up (which has happened once), it goes to a
data retrieval firm - their technology and methods will be around a lot
longer than I'll be. I refuse to **** around with anything less, and I
dropped my last CD on the floor a long time ago.

--
jer email reply - I am not a 'ten'
 



