Archived posting to the Leica Users Group, 2002/02/10


Subject: [Leica] RE: Digital and lens resolution was Re: APD out
From: Jim Brick <jim@brick.org>
Date: Sun, 10 Feb 2002 16:40:55 -0800
References: <B88BF831.AE2%dgp@btconnect.com>

At 06:50 AM 2/10/2002 -0800, Frank Filippone wrote:
>As much as I would like to believe what I am told, especially by very
>reputable sources,  a white paper, is usually, and in this case especially,
>self serving.  I am not an optics expert.
>Frank Filippone
>
>Don't take my word for it - Schneider has published an excellent white paper
>on why high resolution lenses are unsuited to digital applications.


The Nyquist frequency limit. Pixels are evenly spaced and therefore exhibit a 
frequency. Pixel, no pixel, pixel, no pixel, on, off, on, off, hi, low, hi, 
low, etc... exactly evenly spaced. Get the picture? The resolution capability 
of a lens in lines per millimeter is also a (spatial) frequency, which is 
what an MTF (Modulation Transfer Function) chart describes.

Simply for discussion purposes, let's assume the following scenario:

Let's say you can see the separate lines on the transparency or negative at, 
say, 100 lines per millimeter. And let's say that the pixel density of a 
digital sensor is 50 pixels per millimeter, that is, a pixel every 1/50 mm 
(20 microns). This says that there is more data hitting the sensor than the 
sensor can collect. Onto every pixel, two lines of the 100 lpm pattern fall. 
This means that the maximum resolution that the digital sensor can capture 
is 50 lpm.
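The scenario can be played out numerically. A minimal sketch (Python with NumPy; the 1 mm scene and grid sizes are made up to match the example): a bar pattern at 100 lines per millimeter, averaged onto 50 pixels per millimeter, comes out completely flat, because each pixel straddles exactly one black and one white line.

```python
import numpy as np

# "Scene": 1 mm of a bar pattern at 100 lines per mm, modeled on a
# fine grid of 10000 points (black = 0, white = 1)
fine = np.arange(10000)
pattern = (fine // 100 % 2).astype(float)   # each line is 100 grid points = 1/100 mm

# Sensor: 50 pixels per mm; each pixel simply averages the light falling on it
pixels = pattern.reshape(50, -1).mean(axis=1)

# Every pixel straddles exactly one black and one white line:
print(pixels.min(), pixels.max())   # -> 0.5 0.5  (the 100 lpm detail is gone)
```

Every pixel reads the same mid-grey, so no trace of the 100 lpm pattern survives in the data.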

So let's say that you dumb down your 100 lpm lens so it will resolve only 50 
lpm. Now a resolution line will land on each pixel, and so you can capture 
and resolve 50 lpm. But wait...

What if each resolution line of the 50 lpm resolution chart (or that 
"distant" picket fence) lands directly in between each pixel on the sensor? 
Guess what... You get 0 lpm resolution because the sensor cannot see 
between pixels. And software cannot do it digitally because the information 
is simply not there. No picture. Missing fence! So this means you have to 
dumb down the lens even further, so that a resolution line (or individual 
picket) is broad enough to spread across several pixels. That way it can't 
fall in a crack and be lost. So now we are down to perhaps 10 lpm 
resolution in order for the digital sensor to be able to capture everything 
in the scene. Minus, of course, the very fine detail, that fine-detail 
resolution that we all pay dearly for when buying Leica lenses. But since 
the "digital" lens is of low resolution, no fine detail gets through to 
the sensor.
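The picket-fence case can be sketched the same way (Python with NumPy; the 1 mm scene is made up for illustration): a 50 lpm pattern sampled by a 50 pixel/mm sensor gives full contrast when the lines land squarely on the pixels, and zero contrast when the same lines land exactly in between them.

```python
import numpy as np

def contrast(shift):
    # 1 mm of scene on a 10000-point grid; 50 lines/mm -> each line 200 points
    fine = np.arange(10000)
    pattern = ((fine + shift) // 200 % 2).astype(float)
    # 50 pixels/mm -> each pixel is also 200 points; a pixel averages its light
    pixels = pattern.reshape(50, -1).mean(axis=1)
    return pixels.max() - pixels.min()

print(contrast(0))     # -> 1.0  lines land squarely on pixels: full contrast
print(contrast(100))   # -> 0.0  lines land between pixels: the fence vanishes
```

Same lens, same sensor, same scene; only the phase of the pattern relative to the pixel grid changed, and the picture went from perfect to blank.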

So, consider 7 to 15 micron pixels, with four pixels required to make up a 
single color pixel. Remember the Bayer mosaic:
rg
gb

Add to this that there is little hope of making pixels much smaller (they 
can be made down to 3 microns, but the data is then difficult to separate 
from the noise), plus the simple fact that high resolution lenses will send 
more data to the sensor than can be collected, with some data falling in 
the cracks and multiple pieces of data falling on one pixel. You end up 
with aliasing and other very strange artifacts that cannot be corrected in 
hardware or software. The Nyquist limit says that it is impossible to 
recover frequency data with a sampling rate that is not at least twice the 
data rate (frequency); in practice, with a color mosaic, you want a 
sampling rate roughly four times the data rate. You cannot collect 100 lpm 
frequency data with 100 pixels per millimeter of spacing. You can use a 
lens with 25 lpm or less resolution with 100 pixels per millimeter. So 400 
pixels per millimeter might work with a 100 lpm lens. But... it takes four 
pixels to collect one piece of color data. So it gets complicated.
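Undersampling does worse than just losing detail: it manufactures false detail. A sketch of the aliasing described above (Python with NumPy; the 30 cycles/mm figure is made up for illustration): a pattern above the sensor's Nyquist limit reappears in the data at a false, lower frequency.

```python
import numpy as np

fs = 50.0                          # pixels per mm -> Nyquist limit 25 cycles/mm
f_scene = 30.0                     # scene detail at 30 cycles/mm, above the limit
k = np.arange(500)                 # 10 mm worth of pixels
samples = np.cos(2 * np.pi * f_scene * k / fs)

spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(k.size, d=1 / fs)    # in cycles per mm
print(freqs[spectrum.argmax()])    # ~20, not 30: a phantom coarser pattern
```

The 30 cycles/mm detail shows up in the file as a bogus 20 cycles/mm pattern, and no software can tell afterwards which one was really in the scene.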

The bottom line is that this phenomenon has been known for 100 years. It is 
something engineers deal with constantly in all disciplines, not just 
optics and sensors. This was (and still is) a giant problem with digital 
oscilloscopes and analyzers.

Astronomers cheat. They take four separate pictures, moving the sensor half 
a pixel (right, down, left, up) for each photograph, thus capturing the 
data that falls in between pixels and separating the doubled-up data that 
falls on one pixel. This is how they resolve double stars and the like.
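That trick can be sketched too (Python with NumPy; a toy geometry, made up for illustration): a 50 lpm pattern that falls between the pixels of a 50 pixel/mm sensor disappears in one exposure, but a second exposure with the sensor shifted half a pixel brings it back.

```python
import numpy as np

fine = np.arange(10000)                     # 1 mm of scene on a 10000-point grid
# 50 lines/mm, positioned half a pixel off, so lines fall BETWEEN the pixels
pattern = ((fine + 100) // 200 % 2).astype(float)

def expose(sensor_shift):
    # 50 pixels/mm sensor, moved sensor_shift grid steps before the exposure
    shifted = np.roll(pattern, -sensor_shift)
    return shifted.reshape(50, -1).mean(axis=1)

flat = expose(0)        # every pixel sees half a black and half a white line
moved = expose(100)     # sensor moved half a pixel (1/100 mm)
print(flat.max() - flat.min())     # -> 0.0  no fence
print(moved.max() - moved.min())   # -> 1.0  fence recovered
```

Interleaving the shifted exposures effectively doubles the sampling rate, which is exactly what recovers the data that fell in the cracks.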

What about the Nikons and Canons? They have frequency cutoff (anti-aliasing) 
filters built into the camera. They dumb down the lens. Otherwise their 
pictures would look like crap.
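The effect of such a cutoff filter can be imitated in a sketch (Python with NumPy; the box blur is a crude stand-in for a real optical low-pass filter, and all the numbers are made up): blurring the scene before sampling knocks down the above-Nyquist detail that would otherwise alias.

```python
import numpy as np

x = np.arange(10000) / 1000.0            # 10 mm of scene, 1000 points per mm
scene = np.cos(2 * np.pi * 30 * x)       # 30 cycles/mm: above Nyquist for 50 px/mm

# Crude "anti-aliasing filter": a box blur two pixel pitches (2/50 mm) wide
kernel = np.ones(40) / 40.0
blurred = np.convolve(scene, kernel, mode='same')

def alias_amplitude(signal):
    samples = signal[::20]               # sample at 50 pixels per mm
    spectrum = np.abs(np.fft.rfft(samples)) / samples.size * 2
    return spectrum.max()                # strength of the strongest (aliased) tone

print(alias_amplitude(scene))            # ~1.0: full-strength phantom pattern
print(alias_amplitude(blurred))          # ~0.16: mostly filtered out
```

The filter throws away resolution the lens delivered, which is the whole point: better a slightly soft picture than a sharp one full of phantom patterns.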

Jim

-- 
To unsubscribe, see http://mejac.palo-alto.ca.us/leica-users/unsub.html

Replies: Reply from "Eric" <ericm@pobox.com> ([Leica] Re: Digital and lens resolution was Re: APD out)
In reply to: Message from David Prakel <dgp@btconnect.com> ([Leica] Digital and lens resolution was Re: APD out)