We all know that measuring the color gamut of a display or printer is pretty straightforward: for a display you just use a good spectrometer with the appropriate software. But how do I measure the gamut of an analog source device? I'm concerned that the data could be skewed if I rely on a display or capture card along the way. What's the best method?
Posted: Wed Feb 22, 2017 8:41 am
I don't think you can do more than measure the individual signal levels of the red, green, and blue channels. Those have to be equal to each other whenever the source device outputs any kind of grey-to-white picture. With an analog RGB source, this is usually done with an oscilloscope, measuring the amplitude of each signal.
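As a rough illustration, if your scope can export each channel's waveform as CSV, a small script can compare the amplitudes. This is only a sketch in Python: the file names and the "volts" column heading are assumptions (adjust them to whatever your scope actually writes), and 0.7 V is the nominal peak video level for analog RGB.

[code]
# Sketch: compare R/G/B amplitudes from oscilloscope CSV exports.
# Assumptions: one CSV per channel named red.csv / green.csv / blue.csv,
# each with a "volts" column -- adjust to your scope's export format.
import csv

NOMINAL_V = 0.7  # nominal peak video level for analog RGB, in volts

def amplitude(path):
    """Peak-to-peak amplitude of the waveform stored in a CSV file."""
    with open(path, newline="") as f:
        volts = [float(row["volts"]) for row in csv.DictReader(f)]
    return max(volts) - min(volts)

for channel in ("red", "green", "blue"):
    amp = amplitude(f"{channel}.csv")
    dev = amp / NOMINAL_V - 1
    print(f"{channel}: {amp:.3f} V ({dev:+.1%} vs nominal)")

# On a grey-to-white test pattern the three amplitudes should match;
# a channel reading high or low will tint the greys.
[/code]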
Normally, color calibration covers the entire video chain. As long as the light output reaching the viewer's eye is calibrated, it doesn't matter how much deviation any single device introduces relative to the original signal. The only thing that may happen is that one device in the chain (usually the display) has limited capabilities and therefore cannot pass a portion of the original signal (e.g. a smaller color space). Normally, these missing components cannot be restored by calibration alone...
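To make the "smaller color space" point concrete, here is a sketch that checks whether a display's primaries enclose the source's primaries in the CIE xy plane. The Rec.709 chromaticities are the published values; the "measured" display primaries are hypothetical numbers invented for the example.

[code]
# Sketch: test whether hypothetical measured display primaries enclose
# the Rec.709 source primaries in the CIE xy plane. A primary outside
# the display's triangle is clipped and cannot be restored by calibration.

def edge_sign(p, a, b):
    """Cross-product sign of point p relative to the edge a->b."""
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def inside(p, tri):
    """True if point p lies inside the triangle tri (three xy pairs)."""
    signs = [edge_sign(p, tri[i], tri[(i + 1) % 3]) for i in range(3)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

REC709 = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)}
measured = [(0.655, 0.335), (0.310, 0.580), (0.145, 0.050)]  # made-up display

for name, primary in REC709.items():
    status = "covered" if inside(primary, measured) else "OUT OF GAMUT"
    print(f"Rec.709 {name} primary {primary}: {status}")
[/code]

With these made-up numbers the script reports the green primary as out of gamut; that kind of undersaturated green is exactly the deficit that calibration alone cannot correct.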
Posted: Wed Feb 22, 2017 2:15 pm
Barclay is correct. Calibrating at anything but the final point in the chain doesn't really make sense. Just make sure the source passes the entire signal, which they pretty much all do; otherwise the output would be completely wrong.