DPR Forum

Regarding Contax N digital and Leica digital back

> Thanks Austin. That was a very clear explanation and is much appreciated.

My pleasure, John.

Regards,

Austin
 
Good old Austin,

Just a few Xilinx FPGA? Do you know I had been involving in design Xilinx Vertex II/III FPGA chips? Now this is not only a chip you have heard of, it is something you use a lot? Happy now? ;)

I will ignore you for now.


Hi Mike

Points well taken... actually this is the main reason I have never posted on contaxinfo... to keep out of flame wars! This is the internet thing. There are always half-baked ideas and misinfo floating around, and it is really tiresome to fight them off. This is why I prefer to have discussions in private emails. If you are very interested in knowing more details, how about we chat in private?

I really do not care what people think of me, how much they think I know, whether I am an expert, etc. Plus, due to the NDA I signed, I cannot openly discuss things I know.

Originally Sheu was asking me some usage problems regarding the ND, hence my reply was mostly targeted toward that. My reply to him was not intended to be an answer to other digicam issues.

Your question regarding the A/D is very good. Yes, linearity and noise floor are two main issues, but there are many other factors involved as well. The clock speed for reading out a CCD sensor is fairly high now, so clocking error is a big issue. Line noise is another issue. Those discrete components around the sensor are also an issue. A/D conversion error itself is also an issue. You cannot just directly use the data from the A/D stage as raw data. Most companies choose to compensate the data with calibration data for the sensor, such as geometric noise, etc., before generating the really useful raw data.
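
As a rough illustration of that last compensation step, here is a minimal Python/NumPy sketch of correcting A/D output with per-sensor calibration data before calling it raw. The frame size, offset frame and gain map are made-up placeholders, not any manufacturer's actual pipeline:

    import numpy as np

    # Hypothetical 12-bit A/D output for one frame (codes 0..4095).
    adc_counts = np.random.randint(0, 4096, size=(2048, 3072)).astype(np.float64)

    # Calibration data measured once for this particular sensor (assumed values):
    dark_frame = np.full_like(adc_counts, 64.0)   # fixed-pattern offset per pixel
    gain_map   = np.ones_like(adc_counts)         # per-pixel response correction

    # Compensate the A/D data with the sensor's calibration data
    # before treating the result as the "really useful" raw data.
    raw = np.clip((adc_counts - dark_frame) * gain_map, 0, None)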

Canon's CMOS sensor is more complex than this. A CMOS sensor is basically an active sensor (a CCD is more like a passive capacitor device). Usually one CMOS pixel is made of four transistors plus one photodiode. This is why, for the same pixel size, a CMOS sensor's real light-sensing area is much smaller than a CCD's. Anyway, there is no real shutter in a CMOS sensor. Those 4 transistors are responsible for signal amplification as well as voltage sampling. Imagine how many signal amplification transistors exist in a CMOS sensor and how hard it is to achieve uniform performance among them.
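
To put rough numbers on the fill-factor point (a toy calculation with invented figures, not Canon's actual pixel geometry):

    # Hypothetical CMOS pixel: the four transistors and wiring take part of the
    # cell, leaving only the remainder for the photodiode.
    pixel_pitch_um  = 7.4
    cell_area       = pixel_pitch_um ** 2        # ~54.8 square microns
    transistor_area = 30.0                       # assumed area lost to the 4T circuit
    fill_factor     = (cell_area - transistor_area) / cell_area
    print(f"light-sensing fraction ~ {fill_factor:.0%}")   # ~45% in this toy case

A comparable CCD pixel, with no in-pixel transistors, keeps far more of the cell for light sensing.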

Even though CMOS manufacturing technology is very mature today, for a chip/sensor this big, yield is still a concern. The good thing about a CMOS sensor is that when you have a defect, it can only be a noisy/less-sensitive photodiode or a defective transistor, instead of a dead pixel as in a CCD sensor. You can perform some sort of signal repair through on-die amplification to save the defects and make the sensor usable.
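
A minimal sketch of that kind of defect repair, assuming the camera already knows a map of its bad pixels (an illustration only, not any vendor's actual algorithm):

    import numpy as np

    def repair_defects(raw, defect_mask):
        """Replace known-bad pixels with the mean of their horizontal neighbours.
        raw: 2-D array of sensor counts; defect_mask: True where a pixel is bad.
        (Edge wrap-around from np.roll is ignored for brevity.)"""
        fixed = raw.astype(np.float64).copy()
        left  = np.roll(fixed, 1, axis=1)
        right = np.roll(fixed, -1, axis=1)
        fixed[defect_mask] = 0.5 * (left[defect_mask] + right[defect_mask])
        return fixed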

So you will have to put signal processing circuits on the boundary of the chip to emulate the shutter behavior, to normalize the amp gain, to cancel floor noise, to fix the defects, etc. The signal output from the CMOS sensor is not *raw* anymore.

Next, Canon still has to do some data calibration before outputting the raw data. More noise cancellation stuff there. You may want to check out this web page:

http://www.nucoretech.com/nu2/20_products/ipt/ndx-1260/00.html


Take a look at the functional diagram at the bottom. You will notice a feedback loop around the 16-bit A/D called black-level calibration. You will see how complex it is.
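
In spirit, that loop watches reference pixels that never see light (optically black columns) and servos the offset so that true black lands at zero. A simplified, purely digital emulation of the idea (the number of reference columns is an assumption, and this is not the NuCore design itself):

    import numpy as np

    def black_level_correct(frame, n_black_cols=16):
        """Estimate the black level from optically-black columns at the left
        edge of each row and subtract it, row by row."""
        frame = frame.astype(np.float64)
        black = frame[:, :n_black_cols].mean(axis=1, keepdims=True)
        return frame - black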

It's great that the webmaster will invite people from Zeiss for an interview. You will hear that lens performance will only become more critical for digicams. Ever heard of the focusing problem of the 10D?

Anyway, as I had said, if you like, I will be glad to discuss this topic more in emails.


-finney
 
> They CAN if the sensor has enough "resolution". On the D-30, no,
> lenses really didn't make much of a difference. It wasn't until
> the D-60 that people acknowledged that the better lenses were in fact
> giving better results.

Oops, James, sorry to have missed your post. I was too busy with work these days.

Anyway, you cannot base your thinking on pixel resolution alone. Image quality is more than just pixel counts. Lenses will still make big differences on the D30. It is just that people mostly use Canon lenses on the D30. They had no idea how a Zeiss or a Leica would perform on a D30. The D30 does have enough dynamic range to show you the better color reproduction capability of Zeiss lenses.


> For me this is becoming a very interesting aspect of this discussion.
> This being the case, can the 'Point and Shoot' Sensor or its equivalents
> (Sony 1/1.8" CX452 sensor! ) really resolve to a sufficient level,
> to give the differences that are being perceived by those testing 5MP cameras?

Digicam design is a balancing act between optical design and circuit design. The answer is that the 452 does have the capability to show the difference in optical performance; however, the circuit design has to be good enough to preserve that difference. The TVSD is one that has circuitry clean enough to show you the difference. The S50 is pretty much the opposite. It is interesting to know that Canon has tuned down the NR level a little on the G5 to give it better color rendition.


> I have done my level best to understand this aspect of Digital photography
> (nowhere near it yet like many others!); it creates a key decision point;
> ie is the TVSD really worth the extra money, or would the Konica KD-500
> (excellent Hexanon lens from what i can analyse) or Canon S50 do just a
> good a job of taking the picture that they generally used for? Size, Weight,
> Print Size, Quality and some of the features IMO remain the key decision points.
> Many aspects of the final result and print can be tweaked to personality preferences
> in PS can they not?
> What is the difference (absolutes or not) in this final result if derived
> from a 5MP Point and Shoot and a 5MP DSLR?

TVSD is not expensive. I paid around $670 for it, two months ago.

I have posted a way to verify the noise on another TVSD thread. My suggestion is that you'd better check out the KD-500's sample photos first to see how its circuitry performs. If the circuit is not clean enough or the NR is too heavy, the Hexanon will not do much good. The S50 is a good choice if you do not want to play with PS. You pay the price in bad tonality, though. If tonality/gradation is lost in the digicam, you will not be able to get it back in PS. If you have the money and really care about image quality, you should consider a DSLR.


-finney
 
Finney,

> Just a few Xilinx FPGA stuff does not mean much.

That's true. Is that all you've done? I've done over 100 Xilinx FPGA designs, and you can find me on their web site in their XPerts section...I'm also on their XEUC (Xilinx Expert User's Council)...and they even pay me to consult for them.

> Do you know I had been involving in design Xilinx Vertex II/III FPGA chips?

Involved? Does that mean YOU did a design using the Xilinx Virtex chips (note, you didn't even spell it right...it's spelled Virtex... i, not e...)? What, exactly, do you mean by "involved"? It's a very amorphous claim, so it's hard to know what the significance of it is. Be careful what you claim...

> Now this is not only a chip you have heard of, it is something you use a lot?

The V3 is not out, so how could I use it? And yes, I happen to have done quite a few V2 designs, and was on the design review team of the V3 (and I reviewed the V2 as well) for Xilinx.

> Happy now? ;)

Again, the significance of your amorphous statement has yet to be established, and really, this isn't the place for it on this list.

> I will ignore you for now.

You only "ignore" me because you can't answer my challenges.

BTW, someone mentioned that my responses weren't that respectful to you...well, there is a good reason for this...because you have not earned my respect. I have not found that YOU treat me (or anyone who challenges you for that matter) with any respect, so accordingly, I simply don't find you deserve it. I have tried, but instead of actually stepping up to the plate and responding to issues technically, you skirt the technical issues, and make it personal.

> actually this is the main reason I have never posted on contaxinfo... to keep out of flame wars!

Sounds like this is frequent for you.

> This is why I prefer to have discussions in private emails.

How convenient. You can claim to be an expert on something, and not subject yourself to any challenges.

> Your question regarding the A/D is very good. The clock speed for reading out a CCD sensor is fairly high now, so clocking error is a big issue.

Clocking error? Specifically, what do you mean by that? If anything, any error in clocking is due to bad implementation/design; there is no need for this to be an issue. What frequency do you believe is "fairly high"?

> Line noise is another issue.

Show me where line noise is an issue. It MAY be an issue in a poor design, but not if the design is well engineered.

> Those discrete components around the sensor are also an issue.

This is not an issue with a proper design, and for me, and other professionals with experience in the industry. This is just basic good engineering practice. Cold solder joints are an issue as well, but the level of issues you are raising is simply not an issue in proper designs. This sounds like Foveon advertising...make up things that don't exist (or are non-issues) so you can claim you've solved them. Hum.

> A/D conversion error itself is also an issue.

What A/D conversion error? Do you mean +-1 LSB? That's not an "issue", it's inherent in the A/D process, and again, won't be seen...nor is there anything that can be done about it.
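
For scale, the standard quantization-noise arithmetic for an assumed 12-bit converter (a generic textbook figure, not a measurement of any particular camera):

    bits         = 12                     # assumed converter resolution
    levels       = 2 ** bits              # 4096 output codes
    lsb_fraction = 1.0 / levels           # one LSB is ~0.024% of full scale
    ideal_snr_db = 6.02 * bits + 1.76     # quantization-limited SNR, ~74 dB
    print(lsb_fraction, ideal_snr_db)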

> You cannot just directly use the data from the A/D stage as raw data.

Well, yes, you CAN. All you need is the PRNU table...and you can do anything to the RAW data out of the A/D. How on earth can you claim that what you can do with the raw data is limited? That's just absurd...as ANYTHING that is done is done WITH this data, whether it's in the camera or outside of the camera.

> Most companies choose to compensate the data with calibration data for the sensor, such as geometric noise, etc., before generating the really useful raw data.

It's called Photo-Response Non-Uniformity (PRNU), and is a compensation for each individual sensing element. It's simply a LUT (Look-Up Table). If you knew enough about how this really worked, you would know what it is called. Like I said, you seem to have a cursory knowledge, but no real experience.

So, if you have the PRNU table, which you do if you have the camera, you CAN get the RAW image data from the A/D and use it, and do ANYTHING to it you want to. You are not limited in what you can do with it, and the "processed" data does NOT give you any more information than the RAW data does. How do you think the out of camera Bayer pattern processing algorithms work?
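
As a minimal sketch of that idea (the array values here are hypothetical, and a real PRNU table covers every sensing element on the chip):

    import numpy as np

    # Raw counts straight from the A/D for a tiny imaginary sensor patch.
    raw_adc = np.array([[1000.0, 1010.0,  990.0],
                        [1005.0, 1020.0,  995.0]])

    # PRNU table: one multiplicative correction per sensing element,
    # characterised once against a flat, uniform exposure.
    prnu = np.array([[1.00, 0.99, 1.01],
                     [1.00, 0.98, 1.02]])

    corrected = raw_adc * prnu   # the same raw data, compensated element by element

Bayer-pattern demosaicing then runs on this corrected data, whether it happens in the camera or on a host computer afterwards.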

> So you will have to put signal processing circuits on the boundary of the chip to emulate the shutter behavior...

Huh? The cameras, like the N Digital, USE the mechanical shutter IN the camera.

> The signal output from the CMOS sensor is not *raw* anymore.

NO ONE ever said that RAW data was the data directly out of the sensing element. Raw data is unquestionably the data out of the A/D, period. No one has ever said any differently. This is YOUR misunderstanding, not anyone else's.

> Take a look at the functional diagram at the bottom. You will notice a feedback loop around the 16-bit A/D called black-level calibration. You will see how complex it is.

This is something that someone MAY choose to use/do, but, whether it is necessary or not has yet to be shown. No high end digital cameras use it, nor do any digital backs...which provide the highest level of image quality.

Just as an FYI, this chip is NOT in any production camera right now. Basically, it's a chip that integrates the typical components that are found in any digital imaging system, plus adds some bells and whistles to distinguish themselves. Again, how well these bells and whistles work, much less for high end cameras and DSLRs has yet to be established.

So, you've managed to find A chip that isn't in any camera, and it's plain fact that none of the high end digital backs (or any of the current high end DSLRs that I am aware of) do any of this "processing" you claim happens (before the A/D) before you get RAW data...so again, what was your point if no one seemingly needs to do, or even does, this "processing"?

Austin
 
Dear Austin,

I suggest we keep to pixel discussions and not picky discussions! Where possible, we should maintain the educational value of this site, which is why I value it so much.

Given this is a forum of Contax and Zeiss fans, or those about to be, ultimately I want to know how I can produce the best pictures possible for my budget. With the accelerated development of digital, I want to know whether I should buy a digi or wait some more and scan film. Developments this year mean the decision is becoming clearer for me, and it will cost me less!

My question to you becomes: what would you do buying into the market now? Is the TVSD good enough, or should I get the TVS? I am about to travel on business intensively; I am more excited about photography than ever (thank you, this forum) and want to ensure I get the best possible companion for my trips. Having had a Ricoh GR-1s die on me in Thailand (I lost the film and the celebration I had captured), Contax quality for me is the way to go (in film at least). For home I will pick up the G2.

Yes, I am certainly impressed by the technical expertise of you and Finney, but your energy is better spent elsewhere.

Best, James
 
Dear all,

Sorry for bringing this whole thing up. I especially feel sorry for my friend Finney, who just meant to tell me some things he knows. He doesn't need all this. Despite all the arguments, it is pretty clear that the lens itself will only become more and more important in the digital era, as sensors come nearer and nearer to the performance of film. I believe that's what Contax users care about the most. The Zeiss lenses will still be there in the future.

Actually there is some information on the Zeiss website comparing Zeiss lenses to others based on a digital back. Alpa performed some testing among lenses from different vendors using the Sinar digital back, and the Biogon 45/3.8 outperformed the others. The information, however, is very limited.

Dear Austin,

Regarding the "respect" issue, from the very first of your post, in my personal feeling, you showed some unrespect and made personal attack on "my friend working in IT industry". That made me feel very bad.

Regarding the technical issues, I believe it would not be hard to find out the right answer, of which there should be only one. From reading your posts, however, I feel that sometimes you and Finney are talking about the same thing from different angles. The attacks you were making sometimes do not contradict Finney's opinion. In general, I don't think it is worth this much argument and debate.

Best, Shu-Hsien
 
Finney,

> Lenses will still make big differences on the D30. It is just that people mostly use Canon lenses on the D30. They had no idea how a Zeiss or a Leica would perform on a D30. The D30 does have enough dynamic range to show you the better color reproduction capability of Zeiss lenses.

If you stated that it was your speculation, instead of presenting it as fact, that would be different...but you state it like you KNOW what you are saying is true, and I don't believe you have any factual basis for your claim.

I quote directly from the D-30 review on the DPReview web site:

"I can't see any major colour or resolution advantage (at least with the 3 megapixel D30)"

And you can go download the charts yourself and see. Please, point out the, and I quote, "BIG difference".

> The D30 does have enough dynamic range to show you the better color reproduction capability of Zeiss lenses.

What, exactly, do you mean by "dynamic range"? Also, how does that "show...better color reproduction capability of Zeiss lenses"?

> Digicam design is a balancing act between optical design and circuit design. The answer is that the 452 does have the capability to show the difference in optical performance; however, the circuit design has to be good enough to preserve that difference.

This one has me very curious...how does circuit design "preserve" the difference in optical performance? What aspect of "optical performance" are you talking about here? It certainly doesn't have a thing to do with resolution; what the sensor sees at an individual sensing element is what the sensor sees at that sensing element, and that has nothing to do with the electronics, given a particular sensor. As far as color fidelity goes, well, that's a different story...but that isn't a significant issue with a P&S, so I don't see what real meaning your statement has.

You make these amorphous, unqualified, unexplained statements, yet you complain that I make things too simple...perhaps some of your statements have some validity, but it's hard to tell from the statement in and of itself. You're the Lemelson of digital camera statements...

> If you have the money and really care about image quality, you should consider a DSLR.

That's the first thing you've said here that I can completely agree with.

Austin
 
Shu-Hsien,

> Sorry for bringing this whole thing up.

You should not be, I believe a number of people have said that they have gotten some good information from the "discussion".

> I especially feel sorry for my friend Finney, who just meant to tell me some things he knows. He doesn't need all this.

If someone claims to be an expert in something, they should be able to stand the test of scrutiny, and be able to substantiate their claims.

This is not your fault at all.

> it is pretty clear that the lens itself will only become more and more important in the digital era, as sensors come nearer and nearer to the performance of film.

I agree completely, and I don't believe anyone said any differently.

> Regarding the "respect" issue, from the very first of your post, in my > personal feeling, you showed some unrespect and made personal attack > on "my friend working in IT industry". That made me feel very bad.

Well, this is a tough situation. Two things...first, you provided claims, which I believe were simply erroneous, from a third party...and they could not be discussed with that party. Second, you tried to give this third party credence by claiming that s/he was in the "IT industry"...and my point was that being in the IT industry doesn't mean that someone has any knowledge of digital cameras. It wasn't meant as a slight at anyone at all, but simply a fact that having an IT background means they are involved with setting up/maintaining computers, typically in a corporate environment...not that they have professional experience with digital imaging engineering. They may very well have a personal interest in digital imaging and know quite a bit, but being from an IT profession doesn't indicate this. Again, it was not meant in any kind of demeaning way.

> Finney's opinion...

That's one of the problems I have with Finney, and he knows it. He states things as facts, not as opinions. If they were stated as opinions, it would be much easier to discuss, and IMO, much less abrasive.

I am sorry you feel bad, but please, Finney gets what Finney asks for, and Finney knows that as well. Believe me, I am only here to give my experience/knowledge on subjects that I am knowledgeable in, and to learn as well.

I don't know what you do for a living, but imagine someone just showing up in your office and telling you that things you know, and have known and verified first hand, are mistaken, that they know more than you, and that you don't know what you're talking about...yet they fail to give any basis for their claims or show any significant background that would lend their statements any credence. I think that would annoy most people.

Best regards,

Austin
 
Hi James

> I suggest we keep to pixel discussions and not picky discussions! Where possible, we should maintain the educational value of this site, which is why I value it so much.

I agree, and I DO try to keep my comments technical, but that may not be possible in this situation, depending on what I am "hit" with.

> I want to know whether I should buy a digi or wait some more and scan film.

That depends on what your needs are.

> Developments this year mean the decision is becoming clearer for me, and it will cost me less!

True. Personally, I'd wait a year or two and see what shakes out. I believe that camera prices will come down to a much lower level, and the actual quality of image will be much higher per unit cost.

I still scan film, mostly B&W. I primarily only use a digital camera in the studio these days, where I use high end digital scanning backs on a Hasselblad. I have a reasonably high end film scanner that does an exceptional job at scanning B&W, as it scans B&W as B&W, not as RGB with a conversion afterwards, which every other current-day film scanner does...and I typically only shoot B&W in medium format these days. I really like a lot of the new 4th layer films, especially the Fuji Press 800...no flash needed, and color balance is exceptional in mixed lighting conditions.

> My question to you becomes: what would you do buying into the market now? Is the TVSD good enough, or should I get the TVS?

I have a TVS...and though I like it very much, it's not nearly as good as my Contax SLRs...both in use and in image quality. I would say the TVS Digital would probably be near equal to the TVS in image quality, though I can't say first hand. If those are your two choices, I'd get the TVSD...and I've actually been considering it.

> For home I will pick up the G2.

That would be a nice combination, the G2 and the TVSD... But have you considered the Canon D10 or whatever it is...the new under-$2k DSLR? From what I hear, that is a really great camera...but for a P&S digicam, the TVSD is certainly a very good choice, and quite attractive.

> Yes, I am certainly impressed by the technical expertise of you and Finney, but your energy is better spent elsewhere.

Thank you, and yes, I entirely agree.

Regards,

Austin
 
Austin,

> > Just a few Xilinx FPGA stuff does not mean much.
>
> That's true. Is that all you've done? I've done over 100 Xilinx FPGA
> designs, and you can find me on their web site in their XPerts
> section...I'm also on their XEUC (Xilinx Expert User's Council)...and
> they even pay me to consult for them.

OK, so you are doing mostly FPGAs and simple ASICs, so what?
Does this mean anything regarding multi-million non-FPGA designs?
You are actually the one who keeps making amorphous
and fuzzy claims. Consulting? What kind of consulting is it?
Xilinx was paying me big bucks to look into their
silicon problems. And how about your consulting work?
Big deal, huh? Do you want to compare how much XEUC paid you
and how much Xilinx paid me?

> > Do you know I had been involving in design Xilinx Vertex II/III FPGA chips?
>
> Involved? Does that mean YOU did a design using the Xilinx Virtex
> chips (note, you didn't even spell it right...it's spelled Virtex...
> i, not e...)? What, exactly, do you mean by "involved"? It's a very
> amorphous claim, so it's hard to know what the significance of it is.
> Be careful what you claim...

Nothing to be careful about here. Xilinx is my customer and we were
helping them solve design problems... and you are just an end user
of Xilinx chips.


> > Now this is not only a chip you have heard of, it is something you use a lot?
>
> The V3 is not out, so how could I use it? And yes, I happen to have
> done quite a few V2 designs, and was on the design review team of the
> V3 (and I reviewed the V2 as well) for Xilinx.


Design review team? Now I really get interested... which design
review team were you on? And what kind of design review have
you done? Design review can be a very fuzzy term, you know?
Who knows, maybe you just gave 'design review' a new
meaning. You were so good at this and did it a lot back
on the Contax mailing list. Now, you WILL have to be careful
with your claim.


> > Happy now? ;)
>
> Again, the significance of your amorphous statement has yet to be
> established, and really, this isn't the place for it on this list.

You are the one who's making fuzzy statements.
And you got hurt simply because I showed people how much
you really knew.

> > I will ignore you for now.
>
> You only "ignore" me because you can't answer my challenges.

Sorry, your so-called challenges are not up to my definition
of challenges. They are merely silly, non-intellectual mumblings
which reflect your self-pity.

> BTW, someone mentioned that my responses weren't that respectful to
> you...well, there is a good reason for this...because you have not
> earned my respect. I have not found that YOU treat me (or anyone who
> challenges you for that matter) with any respect, so accordingly, I
> simply don't find you deserve it. I have tried, but instead of
> actually stepping up to the plate and responding to issues
> technically, you skirt the technical issues, and make it personal.

Believe me, I tried very hard. I even showed you that
to earn people's respect, you do not need to tell people
what company you run, which projects you have done, and
which organization you are in. You also do not have
to ask people to do the same for you. And you have NEVER learned.

Remember that YOU ARE the one who first posted the insulting
message, and later on you fought simply because your weak,
tiny ego got hurt. So, is this what you call responding
to issues technically? Man, you really can spin new
meaning out of old tricks.



> > Your question regarding the A/D is very good. The clock speed for
> > reading out a CCD sensor is fairly high now, so clocking error is a big
> > issue.
>
> Clocking error? Specifically, what do you mean by that? If anything,
> any error in clocking is due to bad implementation/design; there is no
> need for this to be an issue. What frequency do you believe is
> "fairly high"?

This really makes me laugh. Yeah, if your circuit has a problem,
it must be bad implementation/design. How about this:
if your circuit has a problem, you must have hired a stupid engineer?
You really have to hire an engineer with over 25 years of excellent
experience! Hehehehe.

Actually you were answering the question all by yourself.
So you agree that the clocking will be a problem if the design
is bad, right? What do you think? Is 30MHz to 50MHz too low
for you? Yeah, right, easy for you. Show me how many 16-bit 50MHz
A/D conversion designs you have done! And the most important thing:
how expensive is the solution!

> > Line noise is another issue.
>
> Show me where line noise is an issue. It MAY be an issue in a poor
> design, but not if the design is well engineered.

Line noise is always an issue for an analog signal line
that is clocking at 50MHz when you have to preserve
the signal integrity. This is not the low-end
video cam stuff you have done, OK?


> > Those discrete components around the sensor are also an issue.
>
> This is not an issue with a proper design, and for me, and other
> professionals with experience in the industry. This is just basic good
> engineering practice. Cold solder joints are an issue as well, but the
> level of issues you are raising is simply not an issue in proper
> designs. This sounds like Foveon advertising...make up things that
> don't exist (or are non-issues) so you can claim you've solved them.
> Hum.

We are talking about whether the problem exists or not. Do not dodge
the issue again with words like 'good engineering practice
will not make it an issue'... blah blah blah.

Discrete components are always a concern if you want to
get the best image quality. Capacitors along a 50MHz
analog line will always bring in noise. Making it work
is different from making it the best. Your bragging
statement only tells me what level of designs you
are mostly dealing with.


> > A/D conversion error itself is also an issue.
>
> What A/D conversion error? Do you mean +-1 LSB? That's not an "issue",
> it's inherent in the A/D process, and again, won't be seen...nor is
> there anything that can be done about it.

If your knowledge regarding A/D conversion error is limited
to only +-1 LSB, I think we are in totally different leagues here.
Man, this is like the day when I had to explain
to people that clock speed does not mean everything for
CPU performance.

Since you claim to know it all, how about you explain this to me?
Why can an A/D converter introduce digital artifacts? Why?
And what kind of digital artifacts are they?

> > You cannot just directly use the data from the A/D stage as raw data.
>
> Well, yes, you CAN. All you need is the PRNU table...and you can do
> anything to the RAW data out of the A/D. How on earth can you claim
> that what you can do with the raw data is limited? That's just
> absurd...as ANYTHING that is done is done WITH this data, whether it's
> in the camera or outside of the camera.

Hehehe... this is getting funny again. I like the term PRNU,
but people like to use a better term here. Anyway,
the reason there is not much you can do with raw data is that
if the lens is bad, the raw data cannot save you. If the sensor
is bad, the raw data cannot save you. If some image details
were lost in the image conversion pipe, the raw data cannot
save you.


> > Most companies choose to compensate the data with calibration data
> > for the sensor, such as geometric noise, etc., before generating the
> > really useful raw data.
>
> It's called Photo-Response Non-Uniformity (PRNU), and is a
> compensation for each individual sensing element. It's simply a LUT
> (Look-Up Table). If you knew enough about how this really worked, you
> would know what it is called. Like I said, you seem to have a cursory
> knowledge, but no real experience.
>
> So, if you have the PRNU table, which you do if you have the camera,
> you CAN get the RAW image data from the A/D and use it, and do
> ANYTHING to it you want to. You are not limited in what you can do
> with it, and the "processed" data does NOT give you any more
> information than the RAW data does. How do you think the out of camera
> Bayer pattern processing algorithms work?

Hahaha... so this is just how much you know. Do you even know
that doing a color table lookup fix after the A/D stage will
create color phase noise? Sure, you have never seen this,
since you are doing mostly low-end video cam stuff.


> > So you will have to put signal processing circuits on the boundary
> > of the chip to emulate the shutter behavior...
>
> Huh? The cameras, like the N Digital, USE the mechanical shutter IN
> the camera.

*Sigh* You also do not know how the shutter works on a CMOS sensor.


> > The signal output from the CMOS sensor is not *raw* anymore.
>
> NO ONE ever said that RAW data was the data directly out of the
> sensing element. Raw data is unquestionably the data out of the A/D,
> period. No one has ever said any differently. This is YOUR
> misunderstanding, not anyone else's.

You missed the point. I was saying that the signals coming
out of Canon's CMOS sensor have been highly processed already.


> > Take a look at the functional diagram at the bottom. You will notice
> > a feedback loop around the 16-bit A/D called black-level calibration.
> > You will see how complex it is.
>
> This is something that someone MAY choose to use/do, but, whether it
> is necessary or not has yet to be shown. No high end digital cameras
> use it, nor do any digital backs...which provide the highest level of
> image quality.
>
> Just as an FYI, this chip is NOT in any production camera right now.
> Basically, it's a chip that integrates the typical components that are
> found in any digital imaging system, plus adds some bells and whistles
> to distinguish themselves. Again, how well these bells and whistles
> work, much less for high end cameras and DSLRs has yet to be
> established.
>
> So, you've managed to find A chip that isn't in any camera, and it's
> plain fact that none of the high end digital backs (or any of the
> current high end DSLRs that I am aware of) do any of this "processing"
> you claim happens (before the A/D) before you get RAW data...so again,
> what was your point if no one seemingly needs to do, or even does,
> this "processing"?


You are wrong again. It is really annoying that
I have to type the word 'wrong' so many times in one email! ;)

Anyway, the NuCore chip you saw is already the second-generation chip.
People are using the first chip already. A few new designs with
the new chip are coming. The key concept is that if you
want to reduce the noise, you'd better solve this analog problem
with an analog solution. If you do it at the digital stage,
it will be a bit late and it will introduce digital artifacts.
Digital backs like Phase One's have been trying to do similar things.
They even put temperature compensation stuff in.
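
As a generic illustration of why fixing an analog problem at the digital stage can introduce artifacts (assumed 10-bit codes and an arbitrary 1.5x correction, nothing to do with NuCore's or Phase One's actual circuits): a gain correction applied after the A/D leaves output codes that can never occur, which shows up as gaps in the histogram, while the same correction applied in the analog domain before the A/D does not have that problem.

    import numpy as np

    codes = np.arange(1024)                          # every code a 10-bit A/D can produce
    corrected = np.round(codes * 1.5).astype(int)    # gain correction applied digitally

    span = corrected.max() + 1                       # 1535 possible codes in the stretched range
    used = len(np.unique(corrected))                 # only 1024 of them ever occur
    print(span, used)                                # roughly 1 in 3 output codes is a permanent gap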

Man, am I hungry! Time for supper now! Yummy!

This time, I will really ignore you for now.
:)


Or, try to make up some real challenges, OK?

-finney
 