> > You say "The Foveon 'technology' does not necessarily allow you to have more photosites than other 'technologies' in either the same unit area, or in overall area. In fact, it's probably less. The number of photosites is limited by the cell size, and the cell size is typically limited by noise." is self-contradictory.
I doubt it, but let's see what you believe it says...
> If as you say, the cell size is limited by noise - which I can accept -
It is not physically limited by noise, at least in our application; it is PRACTICALLY, as in usably, limited by noise.
> then the more information gathering you can pack into a single photo site the better off you are.
I don't know what you are saying. It's a physical issue. The larger the photosites are, the more photons they capture. Having multiple sensing units in a single photosite does NOT increase your resolution, as they are all seeing the same "column" of light.
> If it takes three "normal" photo sites to produce one pixel,
That is what you are not understanding. It does not. It takes FOUR photosites (Bayer pattern sensors are RGBG, two G's for additional contrast) to produce FOUR pixels. It is the COLOR (as in chrominance) information that is interpolated over all four pixels, not the luminosity. Each photosite in a Bayer pattern imaging sensor has unique physical information that comes uniquely from that spatially arranged photosite.
> and if you can produce one pixel with one photo site the area required is cut by approximately a factor of three. In the case of the detector used in the Sigma SD9, there are 3,429,216 photosites (2268 x 1512), hence pixels, in an area of 20.7 x 13.8 mm (0.0091269841 mm on a side per photo site). To achieve this same density of photo sites using the old single layer technology, and three photo sites per pixel, would require a photo site size considerably less than 9 microns, which would increase noise dramatically.
No, that is a misunderstanding on your part on how Bayer pattern sensors work. If the Bayer pattern sensor has 2268 x 1512 photosites, it produces, through interpolating color information, 2268 x 1512 PIXELS of image information.
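To make the point concrete, here is a toy sketch of bilinear Bayer interpolation (my own illustration, not any camera maker's actual pipeline, and the edge handling here simply wraps around): the mosaic has one measured color per photosite, the missing two colors at each site are filled in from neighbors, and the output has exactly one full-color pixel per photosite.

```python
import numpy as np

def demosaic_bilinear(mosaic):
    """mosaic: 2-D array of raw photosite values in an RGGB layout.
    Returns an (H, W, 3) RGB image -- one pixel per photosite."""
    h, w = mosaic.shape
    # Masks marking which photosites carry which color filter (RGGB tiling).
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)
    out = np.zeros((h, w, 3))
    for c, mask in enumerate((r_mask, g_mask, b_mask)):
        known = np.where(mask, mosaic, 0.0)
        total = np.zeros((h, w))
        weight = np.zeros((h, w))
        # Average each site's 3x3 neighborhood over the sites that
        # actually measured this color (wrap-around at the borders).
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                total += np.roll(np.roll(known, dy, 0), dx, 1)
                weight += np.roll(np.roll(mask, dy, 0), dx, 1)
        interp = total / np.maximum(weight, 1)
        # Keep the measured value where this color was actually sampled.
        out[..., c] = np.where(mask, mosaic, interp)
    return out

raw = np.arange(16, dtype=float).reshape(4, 4)  # 16 photosites
rgb = demosaic_bilinear(raw)
print(rgb.shape)  # (4, 4, 3): 16 photosites -> 16 full-color pixels
```

Note that only the chrominance is estimated from neighbors; each site's own measured sample is preserved, which is why the spatial (edge) information per photosite survives interpolation.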
Color is NOT as important as you think it is, because of how our eyes work. Edge information is what is important, and with the Bayer pattern sensor, you DO get almost the exact same amount of edge information as you would if you were sampling all three colors at the same photosite.
There is no increase in density because of the Foveon sensor over the Bayer pattern sensor, if they both have the same number of photosites, and the photosites are the same size. If you have a 3M Foveon, you get the same number of pixels as if you have a 3M Bayer pattern imaging sensor.
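The arithmetic in your own numbers bears this out. Working from the figures you quoted (2268 x 1512 photosites on a 20.7 x 13.8 mm sensor):

```python
# Pixel pitch implied by the quoted SD9 numbers:
# 2268 x 1512 photosites on a 20.7 x 13.8 mm area.
width_mm, cols = 20.7, 2268
height_mm, rows = 13.8, 1512

pitch_h = width_mm / cols    # mm per photosite, horizontally
pitch_v = height_mm / rows   # mm per photosite, vertically
print(pitch_h, pitch_v)      # ~0.009127 mm (~9.13 microns) each way

print(cols * rows)           # 3,429,216 photosite locations
```

A Bayer sensor with the same 9.13 micron pitch over the same area has the same 3,429,216 photosite locations, and demosaicing produces one output pixel per location, so the pixel counts match.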
The Foveon sensor MAY produce marginally better color information than a Bayer pattern sensor, and that is VERY image dependent...but interestingly enough, there are issues with the Foveon sensors and color. Their color fidelity is actually pretty poor, as is their low light capability. Both of which I stated a long time ago would be issues with this type of sensor architecture.
Austin