A Photography forum. PhotoBanter.com

Convolution, deconvolution and out of focus images.



 
 
  #1  
Old December 10th 04, 12:55 AM
Dave
Default Convolution, deconvolution and out of focus images.

First, before asking this, I should add I have just come back from the
pub and had quite a few beers, so take that into account !!


I'm an electrical engineer/scientist with an interest in photography. I
know that, in theory, what you actually record on an instrument (e.g. an
oscilloscope, but possibly a camera?) is the convolution of the real
signal and the impulse response of the system.

Measured = Real * Impulse_response

where * = convolution.

But in theory (though far less so in practice), knowing the 'impulse
response' of your system and what you record, it is possible to perform
deconvolution to calculate what the real signal is - despite the fact you
have not recorded it.
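
A minimal sketch of what that naive deconvolution looks like in code
(assuming the impulse response is known exactly and the measurement is
noise-free - the function and variable names are just illustrative):

import numpy as np

def naive_deconvolve(measured, psf):
    """Naive Fourier deconvolution: divide the spectra and transform back.

    Assumes the PSF (impulse response) is known exactly and the data are
    noise-free; with real data the division blows up wherever the PSF
    spectrum is close to zero.
    """
    # Pad the PSF to the image size so the two spectra line up.
    psf_padded = np.zeros_like(measured, dtype=float)
    psf_padded[:psf.shape[0], :psf.shape[1]] = psf

    M = np.fft.fft2(measured)      # spectrum of what was recorded
    H = np.fft.fft2(psf_padded)    # spectrum of the impulse response
    eps = 1e-12                    # guard against division by ~zero
    # Measured = Real * PSF  becomes  Real = Measured / PSF in Fourier space.
    return np.real(np.fft.ifft2(M / (H + eps)))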

This got me thinking about whether you can correct for out-of-focus
images, or imperfect lenses, by knowing their impulse response. This
might (or might not) be what you call the MTF. I think they are related.

So assuming you have a poor lens, and you take a poor photograph of a
scene, can you measure the properties of that lens (find its impulse
response) and deconvolve the recorded image with the impulse response of
the lens to find out what the real picture is, without the distortions,
so negating the effect of your poor lens?

You probably think I'm either drunk (true) or a mad scientist (also true),
but does anyone reading this have a clue what I am on about?




  #4  
Old December 10th 04, 02:56 AM
Michael A. Covington


"Dave" wrote in message ...
I'm an electrical engineer/scientist with an interest in photography. I
know that, in theory, what you actually record on an instrument (e.g. an
oscilloscope, but possibly a camera?) is the convolution of the real
signal and the impulse response of the system.

Measured = Real * Impulse_response

where * = convolution.

But in theory (though far less so in practice), knowing the 'impulse
response' of your system and what you record, it is possible to perform
deconvolution to calculate what the real signal is - despite the fact you have
not recorded it.

This got me thinking about whether you can correct for out-of-focus
images, or imperfect lenses, by knowing their impulse response. This might
(or might not) be what you call the MTF. I think they are related.


It is the point spread function (the MTF is related: it is the magnitude of
the Fourier transform of the point spread function). You're familiar with the
1-dimensional version of this from signal processing, and of course here we
need the 2-dimensional version.

One special case in which it can sometimes be done is astronomy. Every star
is a nearly perfect point source, so you get lots of good measures of the
point spread function in every picture. You should be able to deconvolve
the picture, i.e., apply the inverse convolution, and get back an original
image much sharper than what you actually have.

The problem as I understand it is that the solution is unstable. That is, when
solving for the inverse function, you're in a situation where extremely tiny
errors in the data (noise, film grain) will greatly throw off the
deconvolution. There are various strategies... for one of them, look up
"maximum-entropy deconvolution." This assumes a fair bit about what the
original picture should look like, and in terrestrial photography, one can't
assume much.
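
One standard way to tame that instability is Wiener-style regularization,
which damps the inverse filter wherever the PSF spectrum is weak. A rough
sketch, assuming a constant noise-to-signal power ratio:

import numpy as np

def wiener_deconvolve(measured, psf, nsr=1e-2):
    """Wiener-style deconvolution: a regularized inverse filter.

    nsr is an assumed, constant noise-to-signal power ratio; raising it
    suppresses the wild noise amplification that a plain 1/H inverse
    filter produces where the PSF spectrum is small.
    """
    psf_padded = np.zeros_like(measured, dtype=float)
    psf_padded[:psf.shape[0], :psf.shape[1]] = psf

    H = np.fft.fft2(psf_padded)
    M = np.fft.fft2(measured)
    # conj(H) / (|H|^2 + NSR) instead of the unstable 1/H.
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(W * M))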

A simple alternative is to just assume that the blur is a Gaussian blur
(bell-shaped point spread function). In this particular situation, the
"sharpen" function of typical image software (or, better, "unsharp mask"
with appropriate parameters) approximates the inverse convolution.
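
For what it's worth, unsharp masking itself is only a few lines (a sketch
using SciPy's Gaussian filter; radius and amount are parameters you would
tune to the blur):

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, radius=2.0, amount=1.0):
    """Unsharp mask: subtract a Gaussian-blurred copy to boost fine detail.

    radius approximates the width of the assumed Gaussian point spread
    function; amount controls how strongly the difference is added back.
    This is an approximation to deblurring, not an exact inverse.
    """
    blurred = gaussian_filter(image.astype(float), sigma=radius)
    return image + amount * (image - blurred)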

There's more about this in my book, and still more in several newer books
about image processing.


--
Clear skies,

Michael A. Covington
Author, Astrophotography for the Amateur
www.covingtoninnovations.com/astromenu.html


  #5  
Old December 10th 04, 04:07 AM
McLeod

On Fri, 10 Dec 2004 00:55:04 +0000, Dave wrote:

You probably think I'm either drunk (true) or a mad scientist (also true),
but does anyone reading this have a clue what I am on about?


Not that off the wall. Right now the Nikon Capture software turns the
10.5 mm fisheye lens into a rectilinear lens. It goes a little soft
at the edges due to resampling the pixels, but it is an amazing piece
of work anyway. The same software also maps all the dust spots and
bad pixels and interpolates over them as well, so I think the kind of thing
you're talking about may not be really far off.
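
Nikon has not published how Capture does it, but the general idea of
remapping an equidistant ("f-theta") fisheye projection to a rectilinear
one looks something like this sketch (single-channel image; f_pixels is an
assumed focal length expressed in pixels):

import numpy as np
from scipy.ndimage import map_coordinates

def fisheye_to_rectilinear(img, f_pixels):
    """Remap an equidistant ('f-theta') fisheye image to a rectilinear one.

    For every pixel of the rectilinear output we work out which fisheye
    pixel lands there and interpolate; the resampling is what softens the
    edges, as noted above.  Assumes a single-channel image.
    """
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0

    # Output (rectilinear) pixel coordinates, relative to the image centre.
    yy, xx = np.mgrid[0:h, 0:w]
    dx, dy = xx - cx, yy - cy
    r_rect = np.hypot(dx, dy)

    # Rectilinear: r_rect = f * tan(theta); equidistant fisheye: r_fish = f * theta.
    theta = np.arctan(r_rect / f_pixels)
    r_fish = f_pixels * theta
    scale = np.ones_like(r_rect)
    mask = r_rect > 0
    scale[mask] = r_fish[mask] / r_rect[mask]

    # Corresponding source coordinates in the fisheye image, bilinear sampling.
    src_y = cy + dy * scale
    src_x = cx + dx * scale
    return map_coordinates(img, [src_y, src_x], order=1, mode='nearest')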
  #7  
Old December 10th 04, 09:59 AM
Ken Tough

Apparently Dave wrote:

So assuming you have a poor lens, and you take a poor photograph of a
scene, can you measure the properties of that lens (find its impulse
response) and deconvolve the recorded image with the impulse response of
the lens to find out what the real picture is, without the distortions,
so negating the effect of your poor lens?

You probably think I'm either drunk (true) or a mad scientist (also true),
but does anyone reading this have a clue what I am on about?


Good one. The UK's Qinetiq (defence research establishment) has taken
out patents on new technology based on a similar concept. Their
problem has been working with stuff outside the visible spectrum
and the difficulty of making good lenses for that. They see
applications in things like cheap phone-cam technology, where
putting money into the processing and taking it away from the optics
will be more cost-effective.

--
Ken Tough
  #8  
Old December 11th 04, 01:57 AM
Al Denelsbeck

Dave wrote in :

First, before asking this, I should add I have just come back from the
pub and had quite a few beers, so take that into account !!


I'm an electrical engineer/scientist with an interest in photography. I
know that, in theory, what you actually record on an instrument (e.g. an
oscilloscope, but possibly a camera?) is the convolution of the real
signal and the impulse response of the system.

Measured = Real * Impulse_response

where * = convolution.

But in theory (though far less so in practice), knowing the 'impulse
response' of your system and what you record, it is possible to perform
deconvolution to calculate what the real signal is - despite the fact you
have not recorded it.

This got me thinking about whether you can correct for out-of-focus
images, or imperfect lenses, by knowing their impulse response. This
might (or might not) be what you call the MTF. I think they are related.

So assuming you have a poor lens, and you take a poor photograph of a
scene, can you measure the properties of that lens (find its impulse
response) and deconvolve the recorded image with the impulse response of
the lens to find out what the real picture is, without the distortions,
so negating the effect of your poor lens?

You probably think I'm either drunk (true) or a mad scientist (also true),
but does anyone reading this have a clue what I am on about?



This has actually been done, and if I remember right it's
accomplished through Fourier processing. I've seen the results for high-
magnification work like photomicrography.

What was needed was a guideline portion of the image - a fuzzy spot
that should have been a tightly focused point, or a streak that should have
been a line. Given that, the programs were able to reprocess the image and
perform the deconvolution.
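
Estimating the blur kernel from such a spot is conceptually simple - cut
out a patch around it, subtract the background, and normalize it so it can
be fed to a deconvolution routine like the Wiener sketch earlier in the
thread. A rough illustration (the helper name and the crude background
subtraction are just for the sake of example):

import numpy as np

def psf_from_blurred_point(image, y, x, radius=8):
    """Estimate a point spread function from a fuzzy spot that ought to
    have been a point (e.g. a star or a pinprick highlight).

    Cuts a small patch centred on (y, x), subtracts a crude background
    level, and normalizes the patch to unit sum so it can serve as the
    PSF for a deconvolution routine.
    """
    patch = image[y - radius:y + radius + 1,
                  x - radius:x + radius + 1].astype(float)
    patch -= patch.min()              # crude background subtraction
    total = patch.sum()
    return patch / total if total > 0 else patch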

Part of the problem is that the amount of blur varies across the frame,
because your original subject is usually three-dimensional. Think depth of
field - you may *want* the background out of focus. Working only from the
resulting image, the process has no way of knowing which portion of the image
should be considered in the proper focal plane, and which portion is soft
simply because the lens isn't focused there.

Variations of this have been used extensively by NASA, and fairly
often by law enforcement to obtain better images from poor-quality
surveillance cameras.


- Al.

--
To reply, insert dash in address to match domain below
Online photo gallery at www.wading-in.net
  #10  
Old December 11th 04, 07:25 PM
Dave

Al Denelsbeck wrote:
Dave wrote in :


First, before asking this, I should add I have just come back from the
pub and had quite a few beers, so take that into account !!


I'm an electrical engineer/scientist with an interest in photography. I
know that, in theory, what you actually record on an instrument (e.g. an
oscilloscope, but possibly a camera?) is the convolution of the real
signal and the impulse response of the system.

Measured = Real * Impulse_response

where * = convolution.

But in theory (though far less so in practice), knowing the 'impulse
response' of your system and what you record, it is possible to perform
deconvolution to calculate what the real signal is - despite the fact you
have not recorded it.

This got me thinking about whether you can correct for out-of-focus
images, or imperfect lenses, by knowing their impulse response. This
might (or might not) be what you call the MTF. I think they are related.

So assuming you have a poor lens, and you take a poor photograph of a
scene, can you measure the properties of that lens (find its impulse
response) and deconvolve the recorded image with the impulse response of
the lens to find out what the real picture is, without the distortions,
so negating the effect of your poor lens?

You probably think I'm either drunk (true) or a mad scientist (also true),
but does anyone reading this have a clue what I am on about?




This has actually been done, and if I remember right it's
accomplished through Fourier processing. I've seen the results for high-
magnification work like photomicrography.


This does not surprise me. The convolution of A and B can be obtained by
taking the Fourier transforms of A and B, multiplying them together, and
taking the inverse Fourier transform of the product.
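
A quick numerical check of that, in case anyone wants to see it (plain
NumPy, circular convolution):

import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(64)
b = rng.standard_normal(64)
N = len(a)

# Convolution theorem: the Fourier transform of a (circular) convolution is
# the product of the Fourier transforms, so we can convolve by multiplying
# the spectra and transforming back.
via_fourier = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

# Direct circular convolution, term by term, for comparison.
direct = np.array([sum(a[m] * b[(n - m) % N] for m in range(N))
                   for n in range(N)])

print(np.allclose(via_fourier, direct))   # expected: True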

What was needed was a guideline portion of the image - a fuzzy spot
that should have been a tightly focused point, or a streak that should have
been a line. Given that, the programs were able to reprocess the image and
perform the deconvolution.


I was hoping you could do better than that, without so much information.
Someone mentioned the fisheye lens software.


Part of the problem is that the amount of blur varies across the frame,
because your original subject is usually three-dimensional. Think depth of
field - you may *want* the background out of focus. Working only from the
resulting image, the process has no way of knowing which portion of the image
should be considered in the proper focal plane, and which portion is soft
simply because the lens isn't focused there.


Yes, I see that. The original scene has 3D data, and the imperfect lens
produces depth-dependent blur, but the film plane captures only 2D data,
so some information has been lost.

Variations of this have been used extensively by NASA,


As someone said, there you have point sources.


 



