2009-07-18

Why Moore’s Law Certainly DOES Apply to Digital Photography

Response to: Why Moore’s Law Does Not Apply to Digital Photography

"...At least with respect to expecting the number of mega pixels to double for a given sensor size (full 35 mm frame), this will not happen. We are very near the limit right now. ..."

Another expert pontificating about how there's a hard-and-fast limit to what humans can achieve.

Mr. Maxwell, meet Mr. Clarke:

Arthur C. Clarke formulated the following three "laws" of prediction:

1. When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
2. The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
3. Any sufficiently advanced technology is indistinguishable from magic.


So, how will science and technology blast past the physical limits of diffraction?

Well duh, that's a stupid question. Of course nobody knows! It's called "The Future" because it's, well, in the future.


But here's what I can see coming, and this is just by thinking about it for ten minutes.

5 years: The camera will take dozens of images using various combinations of settings to gather additional information. It will then compensate for any movement of objects within the frame (using MPEG-style motion estimation), and re-combine the processed image elements to provide maximum image quality while achieving the artistic goals of the photographer. It won't always work, but for still subjects this sort of process can generate acceptable 50+ Mpix images from a $100 digital camera. You watch.
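The multi-shot half of that idea is easy to sketch: average a burst of aligned frames and the random sensor noise shrinks as 1/√N. Here's a minimal toy in Python (motion compensation omitted; the 25-frame burst and the noise level are made-up numbers for illustration):

```python
import numpy as np

def stack_frames(frames):
    """Average a burst of aligned frames; random noise falls as 1/sqrt(N)."""
    return np.mean(np.stack(frames), axis=0)

# Toy demo: a flat grey scene plus independent per-frame sensor noise.
rng = np.random.default_rng(0)
truth = np.full((64, 64), 0.5)
frames = [truth + rng.normal(0, 0.1, truth.shape) for _ in range(25)]

single_noise = np.std(frames[0] - truth)
stacked_noise = np.std(stack_frames(frames) - truth)
# With 25 frames, residual noise is roughly 1/5 of a single frame's.
```

The hard part a real camera has to solve is the alignment step before the averaging, which is where the MPEG-style motion estimation comes in.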

10 years: The above-mentioned variable settings will be built into the sensor itself. It won't be a dumb sensor. They'll call them 'Smart Sensors' or similar. They'll have built-in lenses and tunable colour filters per pixel. They'll have built-in complexities to skip around the edges of the laws of physics. The image processor will be built into the sensor (underneath). 100+ Mpix images, given certain restrictions to avoid artifacts, from mid-range cameras.

15 years: The sensor and processor will actually examine the diffraction-generated Airy disks, looking for the obvious oval shape of the outer rings using extreme bit depth. The sensors will be actively cooled to extremely low temperatures, for ultra-short periods (avoiding power and condensation issues), to eliminate noise. The camera's processor will then model the entire diffraction process in reverse to calculate the incoming diffraction-free signal. The camera will have more processing power than your blah-blah-blah... This will be the big breakthrough that allows 250+ Mpix diffraction-free images in the sweet-spot $2k price range (2009 dollars).
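"Model the diffraction process in reverse" is, at heart, deconvolution: if you know the blur kernel, you can divide it back out in the frequency domain. Here's a toy sketch using a Wiener filter, with a Gaussian kernel standing in for the real Airy pattern (the kernel shape, SNR value, and image size are all assumptions for illustration):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, snr=1e3):
    """Undo a known, origin-centred blur kernel with a Wiener filter."""
    H = np.fft.fft2(psf)
    G = np.fft.fft2(blurred)
    # conj(H) / (|H|^2 + 1/SNR) tames division by near-zero frequencies.
    F = G * np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(F))

# Gaussian kernel standing in for an Airy pattern (assumption).
n = 64
y, x = np.mgrid[:n, :n]
psf = np.exp(-((x - n // 2) ** 2 + (y - n // 2) ** 2) / (2 * 2.0 ** 2))
psf = np.fft.ifftshift(psf / psf.sum())  # centre the kernel at the origin

sharp = np.zeros((n, n))
sharp[n // 2, n // 2] = 1.0              # a single point source
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(psf)))
restored = wiener_deconvolve(blurred, psf)
# The restored point is markedly sharper than the blurred one.
```

Real-world deconvolution is far messier (noise, imperfectly known kernels), which is exactly why the prediction leans on extreme bit depth and cooled, low-noise sensors.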

20 years: The sensor will no longer be a planar chip. It'll be something akin to a field of monopole antennas. The antenna elements will swirl around in little circles to probe the electric field of the incoming light. They'll be able to detect sub-diffraction patterns, and the processor will be able to reconstruct the incoming diffraction-free image. Giga-pixel images in a camera the size of a Large Format. They'll be very expensive... ...until the price comes down, and then they'll be affordable (LOL).
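For what it's worth, those guesses roughly track an exponential. Assuming a ~12 Mpix consumer camera as the 2009 starting point (my number, not from the original post), reaching gigapixel in 20 years implies a pixel-count doubling time of about three years:

```python
from math import log

# Back-of-envelope: 12 Mpix (assumed 2009 baseline) to 1000 Mpix in 20 years.
start_mpix, end_mpix, years = 12, 1000, 20
doublings = log(end_mpix / start_mpix, 2)   # about 6.4 doublings
doubling_time = years / doublings           # about 3.1 years per doubling
```

Slower than the classic two-year Moore's Law cadence, but unmistakably the same exponential game.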


That's just ten minutes' brainstorming. Imagine what magic will occur when people actually get paid to work on the problem for the next fifteen to twenty years or so.


A common error is to overestimate change in the short term, and vastly underestimate change in the long term.


In summary, Moore's Law certainly DOES apply to digital cameras. Big time.

And there won't be any stopping it.


Update: A more modest rebuttal covering the near term.