Shu-Hsien,
> ...I won't answer any comments on this post, especially those
> challenging the credential issue:
Why not simply keep the post technical, instead of bringing up the credential issue again, or bragging about this project or that? Most of it simply isn't verifiable, much less applicable, anyway.
> One basic concept I kept stressing is that the conversion process
> (commonly termed as AFE, Analog Front End) from image sensor signals
> to raw image file is not done just through an A/D converter.
You (Finney) claimed that people were mistaken in believing the raw data comes straight from the sensor. That may be true as far as it goes, but no one ever said that it did. What people actually said was that the raw data is specifically the data out of the A/D converter, nothing more.
No one ever said there was only an A/D either. An analog front end is inherent in any design of this nature. How complicated that AFE is doesn't change the fact that the raw data is still the same...it's the data out of the A/D, and the A/D is the only place the raw data can come from.
Obviously, there can be varying degrees of complication in the AFE, going from simple offset and gain (which is pretty much mandatory in any implementation, and probably the most common implementation for an A/D system) to something far more complicated. That does not mean it HAS to be complicated, though, nor that the more complicated methods deliver any actual benefit.
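To put the simple case in concrete terms, here's a rough sketch in Python of what an offset-and-gain AFE stage amounts to in front of the A/D (all numbers are made up for illustration, not taken from any real part):

  # Minimal model of a simple AFE: offset, gain, clamp, then A/D.
  # All values here are illustrative, not from any real chip.
  def afe_sample(v_pixel, offset=0.2, gain=3.0, adc_bits=12, v_ref=6.0):
      v = (v_pixel - offset) * gain               # analog offset and gain stage
      v = min(max(v, 0.0), v_ref)                 # clamp to the A/D input range
      return int(v / v_ref * (2**adc_bits - 1))   # A/D conversion -> raw value

  print(afe_sample(1.2))  # one pixel's raw value (2047 here)

That's the whole of it for the simple case: everything before the A/D is just conditioning the signal so the converter sees it properly.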
Most of the basic functions done in these chips have typically been done using multiple components, so, for the most part, these chips are simply an integrated solution, as opposed to a solution using separate components. These chips integrate the functions into a smaller package, which reduces manufacturing costs, board space, and power consumption...which is all a good thing, and pretty much a typical electronic evolution. No news there.
These chips are doing mostly some level of noise filtering, gain and offset correction, and clamping...and the ones referenced seem to do so with static values. Clearly, what "processing" they do is quite different from what is typically thought of as "image" processing as it relates to the use of digital cameras, like Bayer pattern reconciliation, tonal correction, setpoints, etc.
Really, the only thing that can be called processing in the chips shown here is the noise filtering (CDS)...and even at that, calling it processing is a stretch, and not all cameras have CDS. It's a double sampler: it takes two samples, the reset level and the signal level, and takes the difference between them, which cancels noise common to both. This would really be called a simple filter. Gain (PGA), offset, and clamping are not really "processing", even in a very loose use of the word. One can call just about anything processing, and by this loose definition, a decoupling capacitor, or a resistor divider, or even the sense amplifier on the imaging sensor would be processing as well...and those are not typically thought of as "processing".
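For what it's worth, that CDS operation is about as simple as it sounds. A rough sketch in Python (voltages made up for illustration):

  # CDS sketch: sample the pixel's reset level and its signal level,
  # and keep the difference; noise common to both samples cancels out.
  def cds(reset_sample, signal_sample):
      return signal_sample - reset_sample

  # e.g. reset level 0.5V, signal level 1.7V -> 1.2V of actual signal
  print(cds(0.5, 1.7))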
> In the diagrams, PGA stands for programmable gain amplifier, and VGA
> is variable gain amplifier. They are used mostly for basic noise
> reduction, black level calibration, and white balance, etc, at the
> analog stage.
The PGA and VGA are typically used to apply gain (same as multiplication) to the signal level to match the output range of the CCD with the input range of the A/D. They have nothing to do with "basic noise reduction" or black level calibration. "Basic noise reduction" is the function of the CDS block, and black level calibration (shift) is simply an offset that is added by the "black level shift" (or "offset", depending on which block diagram) block.
Say the CCD has an output voltage swing of 2V, and the A/D has an input range of 6V. The VGA/PGA simply "maps" (applies uniform gain) so that an input of 2V is "seen" by the A/D as 6V, 1V is "seen" as 3V, and so on. One of them, the National part, allows its PGA to be used for white balance, as it has the tables for that to work. I don't believe the others do.
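Or, as trivial arithmetic (same numbers as above):

  # The PGA/VGA just maps the CCD's output swing onto the A/D's input
  # range with one uniform gain (example numbers from the text above):
  ccd_swing = 2.0                 # volts, full-scale CCD output
  adc_range = 6.0                 # volts, A/D input range
  gain = adc_range / ccd_swing    # 3.0

  print(1.0 * gain)  # a 1V pixel is "seen" by the A/D as 3V
  print(2.0 * gain)  # full-scale 2V is "seen" as 6V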
But your point in this post seems to be that we, as general users of produced equipment, should somehow care about these things as they relate to specific implementations. I see many issues with that. One is that we may not be able to find out which chips are being used in a particular camera.
> even with the same image sensor, different AFEs will give you
> different quality of raw image data.
I agree, but that's not the whole story. There is a LOT to the actual implementation of the chip that can make it work very well, or very poorly, so just having a particular chip in a particular camera does not guarantee a certain level of performance. Most of what is done in these integrated chips has to be done anyway, just using more components. Bottom line is, every implementation will be different, whether these chips are used or not, and using these chips does not mean the result will be better. Also, the significance of the difference has yet to be established, and clearly, this significance (if any) depends on what you want to use the raw data for.
I also do not see any possibility for a level of modularity in digital cameras, as you appear to be suggesting, above what we already have today. The on-camera hardware and software are highly integrated. This is very much different from being able to separate a CD transport mechanism from the D/A, as there is a clear boundary there...the digital boundary (which had many of its own problems when first decoupled).
Technically, we already have a boundary, and it's the raw image data. And, you can buy aftermarket Bayer pattern processing software, as well as a lot of other image processing software. But in reality, the equivalent would be the analog signal off the sensor, and that isn't going to happen in the foreseeable future, if ever at all. The requirements for each sensor are different, and as such, I believe, really prohibit any real level of modularity. Some have suggested being able to replace imaging sensors when newer/"better" ones come out...but that is pretty much impossible, for many reasons.
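Just to illustrate what that kind of aftermarket Bayer pattern software does with the raw data at that boundary, here is a minimal bilinear demosaic sketch in Python (it assumes an RGGB pattern; a crude illustration only, not how any particular product actually does it):

  import numpy as np

  def demosaic_bilinear(bayer):
      # Minimal bilinear demosaic of an RGGB Bayer mosaic (illustrative).
      h, w = bayer.shape
      rgb = np.zeros((h, w, 3))
      # Masks marking where each color was actually sampled.
      r = np.zeros((h, w)); r[0::2, 0::2] = 1
      b = np.zeros((h, w)); b[1::2, 1::2] = 1
      g = 1 - r - b
      for ch, mask in enumerate((r, g, b)):
          known = bayer * mask
          num = np.zeros((h, w)); den = np.zeros((h, w))
          pv = np.pad(known, 1); pm = np.pad(mask, 1)
          # Average the same-color samples found in each 3x3 window...
          for dy in (-1, 0, 1):
              for dx in (-1, 0, 1):
                  num += pv[1 + dy:h + 1 + dy, 1 + dx:w + 1 + dx]
                  den += pm[1 + dy:h + 1 + dy, 1 + dx:w + 1 + dx]
          # ...but keep the measured value where this color was sampled.
          rgb[..., ch] = np.where(mask == 1, bayer, num / np.maximum(den, 1))
      return rgb

  mosaic = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 mosaic
  print(demosaic_bilinear(mosaic).shape)             # -> (4, 4, 3)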
If anything, things will move to higher integration (as is typical in the electronics industry), like putting the AFE on the image sensor (more likely on CMOS image sensors), if not the entire digital processing engine as well. More likely still, there will be just two components: the imaging sensor, and "a" chip that integrates the AFE, digital image processing, storage, host communications, etc., or some other partitioning that makes sense, as there are a few different possible approaches. There are already companies working on this very thing, and higher integration typically means less modularity.
> After reading some call for real answers here, I am thinking of
> starting a loosely coupled discussion group to focus more on the
> technology side of photography. I will not mind sharing industry news
> and whisper there. ... Some may warn this kind of discussion group
> will lead to a cult like status...
Well, Finney...ironically, though you accused me of this, it seems like it's you who is "looking for a following"!
There are already many discussion groups/newsgroups/mailing lists that focus on "the technical side of photography". They are all over the Internet. For those interested in Digital B&W photography,
is an excellent group. Even the Leben Epson mailing list has had some real experts in the field of digital imaging on it, and may still, as had the scanners newsgroup. There are many people, including experts, who have been discussing this stuff in open forums, for many years, on the Internet. So, if you aren't simply looking for a following, and you want to actually discuss (not pontificate) with people who are actually in the digital imaging field, and you can take the confrontation, I'd suggest you poke around a bit on some of the already existing lists.
Austin