#41
why device independent color?
In article , Martin Brown wrote:
> On 27/01/2014 22:48, Eric Stevens wrote:
>> On Mon, 27 Jan 2014 10:29:50 -0800, isw wrote:
>>> And just to keep things sort-of on topic, despite its limitations,
>>> NTSC's color space was larger than that of any other commercial
>>> color reproduction technique that existed at the time. (And that
>>> includes color photographic film.)
>>
>> Wow! That brings us back to Dale's original topic. You couldn't say
>> "NTSC's color space was larger than that of any other commercial
>> color reproduction technique that existed at the time" unless you had
>> a device-independent space (such as XYZ, CIELAB, or CIELUV) through
>> which you can connect them. Many thanks. :-)
>
> Yes you could, by showing that the other colour spaces' gamuts could
> be represented as subsets of the NTSC colour space. I am not convinced
> the claim is true about NTSC, though it was for a while a de facto
> colour space standard in practice.

You could just compare the area enclosed by the CIE coordinates of its
primaries to the others (which, IIRC, was the origin of the claim).

I believe that claim came out at the time of the introduction of the
NTSC system. It would have been made with reference to the original red
phosphor, which was rather poor in light output and had a short lifetime
from being driven hard. The replacement red phosphor was much more
robust, but was located at different CIE coordinates (naturally). I do
not know whether the gamut claim was still true with it or not -- but I
think it was.

Isaac
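[Isaac's suggested comparison is easy to sketch: take the CIE 1931 xy chromaticities of each system's primaries and compare the area of the triangle they enclose. A minimal sketch, using the published 1953 NTSC and sRGB/Rec. 709 primary coordinates; note that xy-diagram area is only a rough proxy for gamut "size", since the 1931 diagram is not perceptually uniform -- which is one reason such claims deserve Martin's skepticism.]

```python
def triangle_area(p1, p2, p3):
    """Area of the triangle through three (x, y) points (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# CIE 1931 xy chromaticities of the original 1953 NTSC primaries (R, G, B)...
NTSC_1953 = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]
# ...and of sRGB / Rec. 709 for comparison.
SRGB = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

ntsc_area = triangle_area(*NTSC_1953)  # about 0.158
srgb_area = triangle_area(*SRGB)       # about 0.112

print(f"NTSC 1953: {ntsc_area:.4f}  sRGB: {srgb_area:.4f}")
```

On this crude measure the 1953 NTSC triangle is indeed noticeably larger than sRGB's, consistent with the claim Isaac recalls.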
#42
why device independent color?
isw wrote:
> In article , Martin Brown wrote:
>> On 24/01/2014 23:47, Eric Stevens wrote:
>>> On Fri, 24 Jan 2014 12:13:47 -0500, nospam wrote:
>>>> In article , Dale wrote:
>>>>> what is needed is a colour managed workflow, with the image and
>>>>> each device along the way having a profile. that's how you get
>>>>> the profiles
>>>>
>>>> no, you get the profiles by running the appropriate profiling
>>>> software. what the software does internally doesn't matter. users
>>>> do not need to understand all the math behind it to be able to use
>>>> it. what matters is whether the user gets what they expect, and
>>>> the answer is yes.
>>>
>>> You are missing the point of Dale's original comment: "you need to
>>> convert the device colors through a device independent color space
>>> like XYZ, CIELAB, CIELUV".
>>
>> But that is clearly not true! It is a lot more convenient to convert
>> to a device-independent colour space and from there to whatever
>> output medium you want to use, because the number of profiles needed
>> for N different image sources and M destinations is limited to N+M
>> colour profiles. But you could, with a *lot* more work, compute
>> direct colour profiles for every possible combination of source and
>> destination: N*M. In the early days, when N was about 3 and M was
>> about 4, that was what happened. It may still make a lot more sense
>> to store the original image in the colour space where it was
>> measured, and only ever compute the device-independent form as a
>> hidden step on the way to the output device.
>
> What do you do years later, when all information about the creating
> device's characteristics is long gone, and all you have is an image
> file?

You know about embedded profiles, right?

BugBear
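[Martin Brown's N+M versus N*M point is just counting: with a device-independent connection space as a hub, each device needs one profile to or from the hub; with direct device-to-device conversion, every source/destination pair needs its own transform. A minimal sketch -- the function names are mine, not from any colour-management library:]

```python
def profiles_via_hub(n_sources, m_destinations):
    """Profiles needed when every device converts through one
    device-independent connection space (e.g. XYZ or CIELAB):
    one profile per device."""
    return n_sources + m_destinations

def profiles_direct(n_sources, m_destinations):
    """Profiles needed when every source converts straight to every
    destination: one transform per source/destination pair."""
    return n_sources * m_destinations

# Martin's "early days" example: about 3 sources and 4 destinations.
print(profiles_via_hub(3, 4), profiles_direct(3, 4))      # 7 vs 12

# The gap widens quickly as device counts grow.
print(profiles_via_hub(10, 10), profiles_direct(10, 10))  # 20 vs 100
```

With only a handful of devices the direct approach is merely tedious, which is why it was viable early on; the hub-and-spoke design is what keeps the profile count linear as workflows grow.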