CI Photocommunity


Regarding Contax N digital and Leica digital back

loom

Active Member
Hi everyone,

I talked with a friend of mine who works in San Jose after the digiback for the Leica R8 was announced, and once again he gave me very useful comments, which are as follows:

-----------------------------------------------
The CCD the N Digital incorporates, the 3020, has a pixel size of around 12um; the true size is around 11um. Canon's 1Ds is 8.8um, and since it's a CMOS sensor, the real size is probably a little smaller than 6um. See the difference here? The 3020 has a much higher dynamic range on colors. Not to mention that a CMOS sensor already has a smaller dynamic range by nature, for the same pixel size.

So the comparison of the ND vs. the 1Ds on Contaxinfo is basically a comparison of color resolution vs. pixel resolution.

Another good thing about the ND is the special anti-aliasing/low-pass filter Kyocera developed for it. It can preserve high-contrast detail better than other digicams'. Some people claim that the ND does too much sharpening, so its photos look so contrasty... haha... a big no-no. It is really the anti-aliasing filter that does the job!

As I had told you before, the problem with the Philips sensor is that it can eat up lots of power and can get hot quickly. It is a full-size 35mm sensor, but in reality it is only good for a digiback to use. The ND is basically a scaled-down 120 camera plus a digiback. You need to have a different mentality when you use it. You can not just use it like a D100 or a 10D, firing away wherever you go. If you expect it to perform like a D100 or a 10D, surely you will be disappointed. On the other hand, if you use it in the studio only, or you can control the lighting precisely and take your time over every picture, the ND will give you great results... nice color gradation, good contrast, and great shadow detail. As for noise, you can always use software to remove it.

Many people have an incorrect perception of raw files. They all think raw files are the image data dumped directly from the CCD/CMOS sensor. The reality is that there are lots of things that need to be done to get from the analog signal in the sensor to the digital raw data. Every company has its own way of creating the raw image data. The raw data clwu played with first is actually the CCD test data, which will be used as calibration data in the image processor to generate the real, useful raw file. On the other hand, even the signals dumped from those Canon CMOS sensors have already been processed by the on-chip circuits; they are not *raw* anymore.
In other words, raw image files are not as raw as the word seems. Most of them have been heavily processed. What you can do with that raw data is actually very limited. If you think that just by adjusting the color curves, etc., you will get something like Zeiss color, that is a big joke. If you really think you can play magic with raw data files, how about this? Take a portrait picture of a lady with both a D60 and a 10D (both use exactly the same CMOS sensor) at ISO 400, and show me that you can adjust the raw files to make them look exactly the same. Since the ND and Phase One's digiback use the same sensor too, how about showing me that you can make the ND's photos look as good as Phase One's?

The key point is that the better the digicam is, the more important the lens performance becomes. If you can not see this point, you probably will not be able to tell the difference between a Nikon lens and a Contax C/Y lens anyway.

There are lots of things about a lens... the flare control, the bokeh, the shadow detail, the color gradation... they will all show on good sensors. The difference is there; some people just refuse to admit it.

Oh, the N mount is still the best platform for a DSLR among all 135 systems. Now the real problem is that Kyocera has to find a good sensor for it. A CMOS sensor is really not a solution.

As for the Leica digiback: it is a good move, but it's pretty much just an Imacon product. There is no reason Imacon can not do the same for Contax... the issue is the price. Also, the 6.8um pixel size is too small. This will degrade the Leica lenses' advantage in tonality. BTW, did you know you can pay around US$12K to modify an RTS3 into a DSLR?
----------------------------------------------
 

afranklin

Well-Known Member
> I talked with the friend of mine who works in San Jose after the > digiback for Leica R8 was announced, and once again he gave me very > useful comments, which is as follows:

I take it this is the same friend who is an "IT Professional"? I'm not sure why that lends any credence to his/her statements, or how it relates to the subject of digital photography...as IT has to do with setting up computers, installing software on them, and maintaining them, not with "technology" in general...but nonetheless...

> The CCD N digital incorporated, the 3020 has a pixel size around 12um. > The true size is around 11um. Canon's 1Ds is 8.8um. Since it's a CMOS > sensor, the real size is probably a little smaller than 6um. See the > difference here? The 3020 has a much higher dynamic range on colors.

I am not sure what s/he is trying to say here, but dynamic range has not a thing to do with sensor element size. And, the type of sensor has not a thing to do with the sensor element size either.

> Another good thing about ND is the special anti-aliasing/ low pass > filter Kyocera developed for ND.

In reality, it's no better/worse than anyone else's. Fact is, the higher the resolution of the sensor, the less need there is for this. What, exactly, is "special" about it?

> As I had told you before, the problem with the Philips sensor is that > it can eat up lots of power and can get hot quickly. It is a full size > 35mm sesnor but in reality, it is only good for digiback to use. ND is > basically a scaled down 120 camera plus a digiback. You need to have a > different mentality when you use it. You can not just use it like > using a D100 or a 10D, keep firing away wherever you you go.

The two are NOT related. The speed the sensor can be fired at is directly related to the ability to read the data out of the sensor and put it somewhere. That is set by the architecture, and has nothing to do with power and/or heat.

> Many people have incorrect perception on raw files. They all think the > raw files are the image data dumped directly from the CCD/CMOS sensor. > The reality is that there are lots of things needed to be done from > the analog signal in the sensor to the digital raw data.

The RAW data is the data directly out of the A/D, without any setpoints, tonal curves and/or Bayer pattern processing. That's it, that's raw data, period. The signal from the sensor is just that, an analog signal and is not really "data" at that point, only after the A/D conversion process is it really "data".

> In other words, raw image files are not as raw as the words seem. Most > of them have been heavily processed.

That's not true. Show me one example of that from a major manufacturer.

> What you can do with those raw > data is actually very limited.

In what sense? You can do anything to it your tools allow you to do. In fact, there are aftermarket Bayer pattern processing programs that can do a better job than the ones inherent in the camera, or shipped with the cameras, and the images come out better. Where is the limitation?

> Take a portrait picture of a lady with both D60 > and 10D(both use exactly the same CMOS sensor)at ISO400, and show me > that you can adjust the raw files to make them look exactly the same?

It depends on what you mean by "exactly the same". You can take two pictures with the same camera and they can pretty much not be made "exactly the same", again, depending on what you mean by "exactly the same". I COULD make them indistinguishable from a standard viewing distance, but can I make them bit-for-bit the same? Probably not...even if taken with the same camera.

> Also, the 6.8um sensor size is > too small. This will degrade Leica lenses' advantages on tonality.

By what premise do you believe that sensor element size has a thing to do with tonality?

Some of his/her statements are true enough, and most of it is simply common knowledge readily available on the web, but quite a bit of it simply isn't correct. With your "friend" not here on the list, it is rather difficult to discuss anything s/he says; her/his comments are not part of a conversation, and are therefore difficult to address.

Regards,

Austin
 

loom

Active Member
Hi Austin,

Sorry for the misleading description of my friend's background. Basically he was the one who, indirectly, brought me into the world of Zeiss. He has 10+ years of photography experience and currently owns a variety of equipment, including a comprehensive C/Y lineup, a Hasselblad and a Nikon F100. He holds a computer science PhD from Stanford and now runs his own company in San Jose doing computer chip simulation. He has a very different background from mine, which is medicine, so I am unable to understand 100% of what he actually does in his profession. Besides issues regarding digital photography, we share our opinions and experiences in film photography and a little bit of computational biology, in which I am now working toward a PhD. Of course he is a much more experienced amateur photographer than I am; I've only used a G2 with 4 lenses and an NX with the 70-200 for 4 years so far. I began asking his opinion on digital photography when I could no longer stand the results my Canon S30 gives, and the rest simply followed. In addition, he seems to have a good relationship with the Kyocera people and with companies developing chips used for digital photography. Not too long ago he told me that Kyocera is now talking with NuCore, a company which developed a single-chip solution for doing noise reduction at the analog and A/D level, something that previously could only be incorporated in medium format digibacks, which allow more room in their designs. Panasonic, from what he tells me, has already decided to use NuCore's chip.

Basically he wrote the letter in a relaxed state, and some of the arguments might need refinement. Most of the letter was prompted by a post on a Taiwanese website stating that in the digital era there will be no difference if you use different lenses, since it is all the CCD and the post-processing that make the difference.

Regarding RAW, my understanding is that the A/D process itself already incorporates much processing. It is not as "raw" as you and I had imagined. Of course you can obtain different results using different tools, and he personally highly recommends a noise-reduction software. His point was simply that by playing with image processing software you cannot make the images from different lenses look identical. I am actually one of his believers, since however hard I play with the images from my Canon S30, I cannot get something similar to the TVSD, not to mention the images I scanned from slides taken with the G2. I know the aforementioned comparison might not be a fair one, but I am simply saying that lenses do make a difference.

Regarding the sensor size, my two cents is that the smaller the sensor size, the less light the sensor receives, and that would affect tonality. I might be wrong, since I am not an expert, but I am sure that the sensor size itself plays an important role in image quality.

English is not the native tongue for either of us, so I'm sorry if I have made any wrong statements or descriptions.

Best, Shu-Hsien
 

afranklin

Well-Known Member
Hi Shu-Hsien,

Thanks for his background, but still, that, IMO, doesn't carry as much weight as that of someone who has actually designed this hardware. There are a LOT of armchair digital experts out there, and unfortunately, they all have very strong opinions...some right and some completely wrong, so every statement these "experts" make can confuse matters just as much as it may clarify things.

> in digital era there will be > no differences if you use different lenses, since it is all the CCD > and post-processing that make the difference.

That, to a large degree, was absolutely true...until recently, when the sensor pitch (the size of the physical sensing elements) reached a point where the optics does matter.

> Regarding the RAW, from my understanding is that the A/D process > itself has already incorporated much processing.

Sorry, but that is absolutely incorrect. All the A/D process does is simply convert the voltage (or current, in some cases) to a number, period. NO processing at all is done; a simple linear (in typical cases) conversion is done. The signal path is quite simple...the CCD analog signal goes to an analog circuit that simply matches the voltage (or current, in some cases) range of the CCD output to the range of the A/D input, and then to the A/D. For example, if the CCD has a voltage output of 0-2.5V, and the A/D has a voltage input requirement of -3 to +3 volts, the analog "front end" will simply apply an offset and then a gain so that it maps 0-2.5V into a range of +-3V (6V peak to peak).
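
For illustration, here is a minimal sketch of that offset/gain/quantize chain. The voltage figures are just the ones from the example above; the 12-bit depth and the function itself are my own illustrative assumptions, not any particular camera's converter:

```python
def adc_sample(v_ccd, ccd_range=(0.0, 2.5), adc_range=(-3.0, 3.0), bits=12):
    """Map a CCD output voltage into the A/D input range, then quantize it.

    This is just a linear offset and gain followed by uniform quantization --
    no tone curve, no Bayer processing, no "image processing" of any kind.
    """
    ccd_lo, ccd_hi = ccd_range
    adc_lo, adc_hi = adc_range
    gain = (adc_hi - adc_lo) / (ccd_hi - ccd_lo)   # 6 V span / 2.5 V span = 2.4
    v_in = adc_lo + (v_ccd - ccd_lo) * gain        # 0-2.5 V mapped onto -3..+3 V
    levels = 2 ** bits
    code = round((v_in - adc_lo) / (adc_hi - adc_lo) * (levels - 1))
    return max(0, min(levels - 1, code))           # clamp to a valid code

# A mid-scale 1.25 V pixel lands on (roughly) the mid-scale code:
print(adc_sample(1.25))   # ~2048 for a 12-bit converter
```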

> ...I am simply saying that the lenses do make > difference.

They CAN if the sensor has enough "resolution". On the D-30, no, lenses really didn't make much of a difference. It wasn't until the D-60 that people acknowledged that the better lenses were in fact giving better results.

> Regarding the sensor size, in my two cents is that the smaller the > sensor size, the less light a sensor would receive, and that would > affect tonality.

To a point that would be true, but not at the levels we are talking about here.

> English is not at all the native tongue for him and for me, and sorry > if I had made any wrong statements or description.

Possibly, but I think that he might be missing some understanding as to how a digital camera actually works. Nothing wrong with that, but it is a problem when someone speaks with authority, and doesn't have the story straight. BTW, your English is perfect.

Regards,

Austin
 

jsmisc

Well-Known Member
Hi Austin,
I have seen you refer to Bayer pattern quite often in your posts but in my ignorance I don't know what this is. Digital is a steep learning curve with which I am struggling and I would much appreciate your explanation.
Thanks,
John
 

james

New Member
> ...I am simply saying that the lenses do make > difference.

> They CAN if the sensor has enough "resolution". On the D-30, no, lenses really didn't make much of a difference. It wasn't until the D-60 that people acknowledged that the better lenses were in fact giving better results.

For me this is becoming a very interesting aspect of this discussion. This being the case, can the "point and shoot" sensor or its equivalents (the Sony 1/1.8" CX452 sensor!) really resolve to a sufficient level to show the differences that are being perceived by those testing 5MP cameras?

I have done my level best to understand this aspect of digital photography (nowhere near there yet, like many others!); it creates a key decision point: i.e. is the TVSD really worth the extra money, or would the Konica KD-500 (excellent Hexanon lens from what I can analyse) or the Canon S50 do just as good a job of taking the pictures they are generally used for? Size, weight, print size, quality and some of the features IMO remain the key decision points. Many aspects of the final result and print can be tweaked to personal preferences in PS, can they not?

What is the difference (absolutes or not) in this final result if derived from a 5MP Point and Shoot and a 5MP DSLR?
 

afranklin

Well-Known Member
> What is the difference (absolutes or not) in this final result if > derived from a 5MP Point and Shoot and a 5MP DSLR?

The odds are, a P&S uses what is called an "interline sensor", which is far inferior in image quality to a standard "one shot" type of image sensor used in higher-end cameras. The way to tell if the camera is using an interline sensor is if it has real-time preview...which is a property of the interline sensors and not of the one-shot sensors. The interline sensors are made primarily (if not initially) for video applications...which is why they don't use a shutter and can provide real-time preview...but these sensors, as I said, give lower image quality.

Regards,

Austin
 

afranklin

Well-Known Member
> I have seen you refer to Bayer pattern quite often in your posts but > in my ignorance I don't know what this is. Digital is a steep learning > curve with which I am struggling and I would much appreciate your > explanation.

The Bayer pattern is a pattern of colored filters in the arrangement of Red/Green/Blue/Green, one color over one sensing element. The physical sensing elements themselves (not talking Foveon here) are monochromatic, and in order for them to "see" any color, a color filter has to be used.

Now, the reason for the Bayer pattern is that you need the three colors' worth of data in order to encompass a large part of the visible color spectrum. No filters, and you get grayscale. But, since you can only have one filter of one color over "a" sensing element, you need this quad pattern to get this range of colors. Two greens are used for contrast, BTW. So, in reality, a 6M sensor has 50% of the sensing elements providing green information, 25% blue and 25% red. This leads to a reduced resolution of actual information. This reduction is made up for by a Bayer pattern processing algorithm which interpolates (mathematically estimates the values "in between") to fill in the missing values.
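
As a toy illustration of that quad pattern and the interpolation (my own minimal sketch, not the algorithm any camera actually uses; real converters use much smarter interpolation than this 3x3 average):

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample a full-color image through an RGGB Bayer filter: each
    sensing element keeps only one of the three colors, so the result
    is a single-channel mosaic."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]   # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]   # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]   # green sites (the second green)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]   # blue sites
    return mosaic

def demosaic_naive(mosaic):
    """Fill in the two missing colors at every site by averaging the
    nearest recorded neighbours of that color -- the crudest possible
    interpolation, just to show where the "in between" values come from."""
    h, w = mosaic.shape
    masks = np.zeros((h, w, 3))
    masks[0::2, 0::2, 0] = 1                  # sites where red was recorded
    masks[0::2, 1::2, 1] = 1
    masks[1::2, 0::2, 1] = 1
    masks[1::2, 1::2, 2] = 1
    out = np.zeros((h, w, 3))
    for c in range(3):
        known = mosaic * masks[..., c]
        value_sum = np.zeros((h, w))
        count_sum = np.zeros((h, w))
        for dy in (-1, 0, 1):                 # 3x3 neighbourhood average
            for dx in (-1, 0, 1):
                value_sum += np.roll(np.roll(known, dy, axis=0), dx, axis=1)
                count_sum += np.roll(np.roll(masks[..., c], dy, axis=0), dx, axis=1)
        out[..., c] = value_sum / np.maximum(count_sum, 1)
    return out

# Tiny example: a flat mid-gray 8x8 image survives the round trip unchanged.
img = np.full((8, 8, 3), 0.5)
reconstructed = demosaic_naive(bayer_mosaic(img))
```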

Does that give you a reasonable idea?

Regards,

Austin
 

kajot

Posted by Austin Franklin on Wednesday, July 02, 2003 - 3:25 pm:

Now, the reason for Bayer pattern, is you need the three colors worth > of data in order to encompass a large part of the visible color > spectrum. No filters, and you get grayscale. But, since you can only > have one filter of one color over "a" sensing element, you need this > quad pattern to get this range of colors. Two greens are used for > contrast BTW. So, in reality, a 6M sensor has %50 of the sensing > elements providing Green information, %25 Blue and %25 Red. This leads > to a reduce resolution of actual information. This reduction is made > up for by a Bayer pattern processing algorithm which interpolates > (mathematical interpretation of the values "in between") which fills > in the missing values.

Hello!

Does it mean that in fact the 5Mpix camera has only ca 1.25 M "full color" pixels???

Regards, Krzysztof Janus
 

paulcontax

Well-Known Member
I'm sure that the lens makes a difference even with 2 or 3 MP cameras! I've tested several point-and-shoot (P&S) digis and found the Sony DSC-S70 with the Carl Zeiss lens really the best back then (some years ago). Much better than a Konica, Fuji or some cheap cams. No sensor can capture what the lens doesn't give it! That's the point with the better 5 MP cams: it's still the optics that makes the difference! And I love my Olympus E20p; even if it's quite big, it is very well built, a real camera, not a toy. A real SLR (not just a viewfinder), very fast compared with the P&S cameras, and with a very good and fast lens. Paul
 

afranklin

Well-Known Member
> Does it mean that in fact the 5Mpix camera has only ca 1.25 M "full > color" pixels???

Well, no...not really. The luminance information is still pretty much intact, which is more important than the chrominance information, which is reduced. Typically, and this is very image/algorithm dependent, it's about a 20% degradation in image "fidelity" because of the Bayer pattern.
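
To put rough numbers on it (back-of-the-envelope arithmetic for a nominal 5-megapixel RGGB sensor; the function name is just my own illustration, not a measurement of fidelity):

```python
def bayer_channel_counts(total_pixels):
    """Per-channel sample counts for an RGGB Bayer sensor."""
    return {
        "green": total_pixels // 2,   # 50% of the sites
        "red":   total_pixels // 4,   # 25% of the sites
        "blue":  total_pixels // 4,   # 25% of the sites
    }

print(bayer_channel_counts(5_000_000))
# {'green': 2500000, 'red': 1250000, 'blue': 1250000}
```

So only about 1.25M sites record red and 1.25M record blue, but the 2.5M green sites carry most of the luminance detail, which is why the perceived loss is much smaller than the raw counts suggest.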

Regards,

Austin
 

afranklin

Well-Known Member
> I'm sure that the lens makes a difference even with 2 or 3 MP cameras

With a P&S camera, how do you know it's the lens, as you can't replace the lens to test out other lenses?

Of course the lens CAN make a difference, but only up to a certain point. The P&S cameras' sensors actually have a very fine pitch (close spacing of smaller sensing elements) because of the type of sensor they use (interline), so they are actually more sensitive to lenses than the D-SLRs are...but their image quality is far below that of most any D-SLR. Remember, these little digi P&S cameras use a VERY small sensor...but again, you really can't compare images from any P&S digi to a decent D-SLR...except maybe the TVS Digital ;-)

Only recently have the better lenses mattered on the D-SLRs, as evidenced by the D-30 and D-60 test reports.

Regards,

Austin
 

kajot

> Well, no...not really. The luminance information is still pretty > intact, which is more important than the chrominance information, > which is reduced. Typically, and this is very image/algorithm > dependant, it's about a %20 degredation in image "fidelity" because of > the Bayer pattern.

Now I understand! Thanks a lot:)

Krzysztof
 

keoj

Active Member
I have found Norman Koren's website to be THE BEST for understanding the limits of lens vs. CCD/CMOS sensor vs. film. His entire website and treatment of topics is outstanding.
 

Finneybear

Okay, okay, good old Austin: I am the mysterious IT friend Sheu talked about.

Very interesting; besides the Contax Mailing List, now you are trying to grow some believers here, hmm? Do you miss the fun we had when I teased you about the anti-aliasing filter on the Contax Mailing List? You still haven't learned, have you?

Anyway, let's just say that I have done quite a few chips for Sony and Sanyo. I have dealt with Foveon and have a very close relationship with National Semi, which has done a few projects with Foveon. Is that clear enough?

A big part of digicams is all about chips and image processing. Chips are not really your territory, and you do not have to pretend that you know a lot.

Anyway, the A/D stage is not as simple as you have imagined. A cheap 10-cent A/D can produce a result totally different from a $10K one. Clear enough? Ever wonder why the ND is so much cheaper than a Phase One digiback?

Norman Koren's website is a good starting point for people who are interested in getting to know digital photography technology; however, the stuff he talks about is still very simple and superficial. Most of the info is correct, but there are still minor mistakes here and there. For instance, the discussion of a sensor's pixel size and its performance is not 100% correct.

It can take me hours just to explain the difference between CCD and CMOS. Too bad that I do not have the time to do this now.

-finney
 

Finneybear

BTW, for people who want to know more about the comparison between film and digital sensors, there is a paper you can check out. There are many similar papers around, but I could not find them on the net.

-finney
 

mike_nunan

Hi Finney,

As an interested observer of this forum in general and this discussion in particular, I would like to urge you in the following two ways:

1. Please avoid entering into a flame war on this board. It is a place which has been refreshingly free of this up until now, and I suspect most of the regulars here would not like that to change. I appreciate that some of the comments made towards you by Austin were not exactly respectful, but in the end, if you have a solid understanding of the issues, then that will shine through and the matter will be closed in your favour.

2. Please share the specifics of what you *do* know, as regards the various technical aspects that you have alluded to in your postings and the message of yours that Sheu posted at the start of this thread.

On the second point, I'd particularly like to know more about the transformations or processing that take place in the ADC. Your point about the quality difference between different A/D designs is clearly valid, but I would have thought the only relevant design parameters are linearity and noise floor (obviously there are other parameters such as sample-and-hold time, but hopefully we can assume that the system designers have chosen components that give sufficient margin in this area). If there are digital processing steps that are applied to the converted data before it's saved as a RAW image, then that changes the rules of the game completely. Are you suggesting this is what happens in the Canon designs and others?

Thanks in advance,

-= mike =-
 

dirk

CI-Founder
Hi everybody,

since digital cameras and all the pros and cons, including the technology, are attracting more and more interest among Contax users, we will have some kind of interview/Q&A in autumn with some people from Carl Zeiss at Contaxinfo.com.

We can not yet disclose the exact date, but it will definitely be after August, because of vacation time. Different issues will be discussed, and we will also offer all Contaxinfo.com members the possibility to send us their burning questions, which can then be answered by employees of Carl Zeiss.

This will probably be a recurring event on an irregular basis. Subjects can be, for example, insights into digital photography, the N-lenses and the MF-lenses, recent changes in lens design and their influence on digital photography, etc.

But we will announce this all via our newsletter as soon as we can present more facts and the time frame for the first event.

Dirk
 

afranklin

Well-Known Member
Hi Finney,

> Very interesting, other than Contax Mailing list, now you are trying > to grow some believers here, hum? Miss the fun time how I teased you > about anti-aliasing filter on Contax Mailing list? You still haven't > learned, have you?

I have no idea what you're talking about. I know I have encountered you on the Contax list before, but I don't remember what the issue was, nor am I particularly interested in YOUR interpretation of what transpired. Stick to the here and now.

> Anyway, let's say that I have done quite a few chips for Sony and > Sanyo. I have dealt with Foveon and have a very close relationships > with National Semi which has done a few projects with Foveon. Is this > clear enough?

Clear enough for what? I'm not clear what you are trying to claim here. Though it may mean something to those who don't understand the field well enough to judge the significance, or insignificance, of what you said...it certainly doesn't signify to me that you know what you're talking about with respect to the subject at hand (or really any subject, for that matter). I've designed over 100 ASICs, and have been designing digital imaging devices (amongst many other things) for over 25 years... I am a professional engineer. I'm either on, or have been on, corporate staff for, or consult for, some major players in the EDA, computer and imaging industries.

What you say (actually write) is what I base my opinion on...not your purported qualifications, which, no offense meant, as I said, don't mean you know what you're talking about with respect to digital imaging.

> Big part of digicams is all about chips and image processing. Chips > are not really your territory and you do not have to pretend that you > know a lot.

I don't have to pretend anything. As I've said, I HAVE designed over 100 ASICs, so chips ARE my territory and I DO know a lot in the area of digital imaging and ASIC design. We own our own Synopsys, VCS and Synplify seats (actually, we were seeded by these companies because of our dominant position in the ASIC/FPGA design industry). I currently own a company that does ASIC/FPGA and board-level development.

> Anyway, the A/D stage is not as simple as what you have imagined. A > cheapie 10 cents A/D can produce a result totally different from a > $10K one. Clear enough?

Being that I've designed quite a few A/Ds, and have used just about everything in the marketplace, from the 10-cent ones to some of the most expensive ones (mostly for audio applications in that case), yes, I know quite a bit about A/Ds, AND...what they do is convert voltage/current to a number, period, end of sentence. NO processing is done. What differs between A/Ds is how accurate, repeatable and fast they are. All that is needed is for the characteristics of the A/D to be matched to the characteristics of the CCD. Having a better A/D than the CCD does you NO good.

> Ever wonder why ND is so much cheaper than a > Phase One digiback?

More limited market, smaller sensor, no follow on income from lenses...to name a few reasons.

> However, > the stuff he talks about is still very simple and superficial.

You can call it what you want, but it IS correct information, and is clearly to the point of the conversation. Clouding it with anything more technical does not serve any purpose here, that is unless you specifically want to obfuscate the issue.

> Most > infos are correct but still, minor mistakes here and there.

Like what, specifically?

> For > instance, the discuss about a sensor's pixel size and its performance > issue is not 100% correct.

Specifically what? Be specific here, Finney...you want to come across as an expert on something, and if you're going to criticize someone, do so with facts...be specific. Also, please don't try to diffuse the issue by going off on tangents. Stick to a specific statement you believe is incorrect or inaccurate, say why, and show some substantiation. Just saying something is wrong doesn't really mean anything.

Just keep in mind I AM a professional engineer, with over 25 years of experience designing this stuff. And I don't need someone attacking my credibility based on some unsubstantiated background. You've said you've designed "quite a few" chips, but never said whether they were in any products, or what they actually do...and "quite a few" chips is what seniors in college have designed, so I don't put much credence in that without knowing more. I DO know that some of your statements are not in agreement with my actual design experience, and that of many of the other professionals in the industry I work with...what's the explanation for that? Have we all been doing it wrong for the past 25 years? Somehow I doubt that.

If you want to see some of what I've designed, simply go to my web site. I am currently designing the rendering/LAN (InfiniBand) board for a 1024-node supercomputer visualization system. Like I said, digital imaging IS one of my areas of expertise, as is ASIC/FPGA design.

Austin
 