panzeroceania
Joined: 23 Sep 2011 Posts: 24 Location: Salem, Oregon
Link Posted: Wed Feb 22, 2017 4:17 am Post subject: How can I measure the color gamut of an analog source device |
We all know that measuring the color gamut of a display or printer is pretty straightforward: for a display, you just get a good spectrometer and the appropriate software. But how do I measure the gamut of an analog source device? I'm concerned the data could be skewed if I rely on a display or capture card in the measurement chain. What's the best method?
barclay66
Joined: 27 Jun 2011 Posts: 1291 Location: Germany
TV/Projector: Marquee 9500 Ultra
Link Posted: Wed Feb 22, 2017 8:41 am Post subject: |
Hi,
I don't think you can do more than measure the individual signal levels of the red, green, and blue channels. Those have to be equal to each other whenever the source device outputs any kind of grey-to-white picture. With an analog RGB source, this is usually done with an oscilloscope, measuring each channel's signal amplitude.
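As a rough sketch of the grey-balance check described above: given per-channel amplitudes read off an oscilloscope for a neutral grey test pattern, you can compute each channel's deviation from the mean. The voltage values here are hypothetical examples, not real measurements.

```python
# Sketch: check RGB channel balance from oscilloscope amplitude readings.
# For a neutral grey/white test pattern, all three amplitudes should match.

def channel_deviation(r, g, b):
    """Return each channel's fractional deviation from the mean amplitude."""
    mean = (r + g + b) / 3
    return {"R": (r - mean) / mean,
            "G": (g - mean) / mean,
            "B": (b - mean) / mean}

# Hypothetical readings in volts for a 100% white field:
for ch, d in channel_deviation(0.700, 0.702, 0.691).items():
    print(f"{ch}: {d:+.2%}")
```

A channel that deviates noticeably from the other two indicates a gain imbalance in the source's output stage, which would tint greys before the display is even involved.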
Normally, color calibration covers the entire video chain. Once the light output reaching the viewer's eye is calibrated, it doesn't matter how much deviation each individual device introduced relative to the original signal. The only thing that may happen is that one device in the chain (usually the display) has limited capabilities and therefore won't pass a portion of the original signal (e.g. a smaller color space). These missing components normally cannot be restored by calibration alone...
Regards,
barclay66
kal Forum Administrator
Joined: 06 Mar 2006 Posts: 17860 Location: Ottawa, Canada
TV/Projector: JVC DLA-NZ7