Pages: « 1 2 [3] 4 »
Author Topic: Correcting images in LAB vs RGB  (Read 17509 times)
joofa
Sr. Member
Posts: 485

« Reply #40 on: February 26, 2010, 01:46:20 PM »

Quote from: digitaldog
Having color appearance models and having color appearance models implemented in imaging software are two very different things.

What products have such color appearance models?

Hi DigitalDog,

I think I was trying to point out that if no products out there offer certain capabilities of a given space, then it is the product's responsibility to modernize, and not necessarily the fault of the color space.

On a different note, I shall have a look at the Photoshop API again to see whether some of the above-mentioned features could be implemented as plugins, which any third party out there can write. It has been a while since I looked at the Photoshop SDK, as I do most of my image manipulation in Adobe After Effects: I am more used to it, its API/SDK is very clean, and I can do most of what I need to do to images there.

Logged

Joofa
http://www.djjoofa.com
Download Photoshop and After Effects plugins
Hening Bettermann
Sr. Member
Posts: 539

« Reply #41 on: February 26, 2010, 02:58:13 PM »

Thank you for your answers. Sorry that I overlooked that Bruce had already answered my first question.

> Are you sure you’re not thinking about an L* tone response curve (which is full of controversy too and based on my understanding isn’t anywhere as useful or necessary as some would suggest)?

It is in fact the L* tone response curve which I have encountered when profiling the monitor.

Kind regards - Hening.
Logged

ejmartin
Sr. Member
Posts: 575

« Reply #42 on: February 26, 2010, 06:15:39 PM »

Quote from: crames
Lightness and color are not fully separated in Lab. One of the biggest disadvantages of Lab as an editing space is that changes in the L channel affect color saturation. Increasing L reduces saturation and decreasing L increases saturation, in a way that looks unnatural to me.

Cliff

Is there a good representation in which they are fully separated?
Logged

emil
Schewe
Sr. Member
Posts: 5255

« Reply #43 on: February 26, 2010, 08:30:31 PM »

Quote from: joofa
I think I have a few books by Bruce Fraser at home and I shall go and recheck them, but from what is quoted of his writings here on this forum, it appears that Bruce is heaping criticism on the Lab space, much of which, in more modern specs, has been assimilated into a theoretical model built around Lab (think color appearance models, CAMs, used in conjunction with Lab). In theory, it does not matter if Photoshop does not have them; an author should point out that it is Photoshop's responsibility to modernize, and not necessarily the fault of a particular space.

As a friend and colleague of Bruce, let me weigh in here...

First off, quoting Bruce is by definition a glimpse in history since Bruce has not had the opportunity to revise his opinions since 2006 when he passed away...

On the other hand, nothing about what Bruce had written has changed regarding the 800 lb gorilla in the room, Photoshop. I don't care if "technically" things have changed in specifications or new concepts; the fact is, Photoshop's Lab hasn't changed since Photoshop first implemented it. Please correct me if I'm wrong...

Lab has its uses...less so when dealing with reasonable digital capture (much to the chagrin of Dan Margulis, who STILL advocates processing digital captures through Camera Raw with zero image adjustments because, well, Camera Raw isn't useful for professionals).

The fact that Adobe and the Photoshop engineers (who are a pretty smart group) haven't seen a reason or benefit in radically changing Photoshop's implementation speaks volumes...

Use Lab for what it's good for. But honestly, I have never seen a digital capture that COULDN'T be corrected in RGB, either in ACR or Photoshop, and it's not very useful to wave your hands and claim some sort of mystical capabilities for Lab.
« Last Edit: February 27, 2010, 01:13:17 AM by Schewe » Logged
ErikKaffehr
Sr. Member
Posts: 6916

« Reply #44 on: February 27, 2010, 01:06:16 AM »

Hi,

Outputting to sRGB removes the colors falling outside the sRGB gamut. The advantage of editing in a wider space first is that you keep as much information as possible until that final conversion. Also, whatever you do in ACR or Lightroom (which shares the same processing engine) is guaranteed to be done in the right order, at least according to the views held by Adobe.

Actually, there are a few other issues. ACR/LR does not use Pro Photo RGB as such; it uses "Pro Photo primaries in a linear space." So it has the same gamut as Pro Photo RGB but no gamma (or, rather, a gamma equal to one). There are quite a few arguments in favor of editing color as long as possible in linear gamma.

In most cases the differences will be subtle. Dan Margulis, a well-known authority on image processing, holds the view that 16-bit processing is not needed. Most other image processing experts say that using more bits is beneficial. The way I see it, it's a good approach to keep as much information as possible. Of course, having a parametric workflow based on raw images essentially means that nothing is lost, except in the final stage, when a picture is processed for its intended use.
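The linear-gamma argument is easy to demonstrate with a small sketch (my own illustration, not from any of the posts above): averaging two pixel values in gamma-encoded sRGB gives a visibly darker result than doing the same average in linear light and re-encoding.

```python
def srgb_to_linear(v):
    """Decode an sRGB value in [0, 1] to linear light (IEC 61966-2-1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    """Encode linear light back to gamma-corrected sRGB."""
    return 12.92 * v if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

# Blend 50% black with 50% white, e.g. when downscaling a checkerboard.
gamma_blend = (0.0 + 1.0) / 2  # naive average of the encoded values
linear_blend = linear_to_srgb((srgb_to_linear(0.0) + srgb_to_linear(1.0)) / 2)

print(round(gamma_blend, 3))   # 0.5   -- renders too dark on screen
print(round(linear_blend, 3))  # 0.735 -- the physically correct mid-blend
```

The same effect shows up in resizing, blurring, and compositing, which is the kind of operation the linear-gamma argument is usually about.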

Best regards
Erik


Quote from: Ishmael.
I don't mean to beat this horse to death but I am still not clear on one issue: is it wise to edit in Camera Raw using Pro Photo/16bit  and then convert to sRGB/8bit when I'm saving JPEGs for the web? Or is the conversion back to sRGB just going to undo whatever advantages Pro Photo gave me?
Logged

ejmartin
Sr. Member
Posts: 575

« Reply #45 on: February 27, 2010, 01:33:36 AM »

Quote from: ErikKaffehr
There are quite a few arguments in favor of editing color as long as possible in linear gamma.

And those are...?
Logged

emil
ErikKaffehr
Sr. Member
Posts: 6916

« Reply #46 on: February 27, 2010, 05:27:32 AM »

Hi,

I need to check this out. I'm pretty sure I have seen that argument made, but I'm not certain.

Best regards
Erik


Quote from: ejmartin
And those are...?
Logged

crames
Full Member
Posts: 210

« Reply #47 on: February 27, 2010, 09:02:11 AM »

Quote from: ejmartin
Is there a good representation in which they are fully separated?
The chromaticity representation, xyY. Color is converted (using the color profile info) to tristimulus values X, Y, Z, with x = X/(X+Y+Z) and y = Y/(X+Y+Z). Since little x and y are ratios, if you change Y (luminance) and X and Z change proportionately, the chromaticity (x, y) stays the same.

A way to separate them in CIELAB might be to divide the a and b coordinates by L, then multiply a and b by the new L after L is changed.
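The xyY claim above can be checked numerically (an illustrative sketch with arbitrary values): scaling X, Y, and Z by the same factor changes luminance while leaving the chromaticity coordinates x and y untouched.

```python
def xyz_to_xyY(X, Y, Z):
    """Project tristimulus XYZ onto chromaticity (x, y) plus luminance Y."""
    s = X + Y + Z
    return X / s, Y / s, Y

def scale_luminance(X, Y, Z, k):
    """Scale luminance by k while holding chromaticity fixed."""
    return k * X, k * Y, k * Z

X, Y, Z = 0.25, 0.40, 0.10  # an arbitrary color
x0, y0, _ = xyz_to_xyY(X, Y, Z)
x1, y1, Y1 = xyz_to_xyY(*scale_luminance(X, Y, Z, 0.5))

print(x0 == x1 and y0 == y1)  # True: chromaticity unchanged
print(Y1)                     # 0.2: luminance halved
```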

There is also a relative of CIELAB:  CIELUV, which has a true "saturation" correlate, but is probably not a good editing space.

Saturation is the key: article by R.W.G. Hunt
« Last Edit: February 27, 2010, 09:02:42 AM by crames » Logged

Cliff
joofa
Sr. Member
Posts: 485

« Reply #48 on: February 27, 2010, 09:10:47 AM »

Quote from: Schewe
First off, quoting Bruce is by definition a glimpse in history since Bruce has not had the opportunity to revise his opinions since 2006 when he passed away...

I did go back and check out a few books by Bruce Fraser that I have. It has been a while since I looked at them, and I must say that I admire his writing. It would appear to me that Bruce's viewpoint was certainly more comprehensive, and not as narrowly focused as it appears from some of the clippings of his writings quoted in this thread. However, it seems he was more interested in practical issues, such as making profiles, than in the details of color science. But that is fine considering the audience he was targeting.

Quote from: Schewe
The fact that Adobe and the Photoshop engineers (who are a pretty smart group) haven't seen a reason or benefit from radically changing Photoshop's implementation of something is telling volumes...

I don't know why you think that if Photoshop is not doing something, it means the engineers judged it technically worthless. There are other reasons why industry doesn't adopt many things that seem technically "correct," and they could apply here: (1) if the public is happy with a product, why change it unnecessarily, and (2) Photoshop is a product that many (most?) people use to make visually pleasing images, not necessarily technically or scientifically correct images.

Consider an example: the NTSC coefficients for converting to grayscale, i.e., 0.299*R + 0.587*G + 0.114*B, are well known. However, the R, G, and B typically used are nonlinear (gamma-corrected), while the coefficients {0.299, 0.587, 0.114} were actually derived for linear R, G, and B. NTSC did a certain amount of research into why the same coefficients work acceptably even in the "technically incorrect" nonlinear setting. IIRC, some SMPTE publications have used similar sets of coefficients that were not "technically matched" to the primaries, perhaps for historical reasons, or because they produced visually pleasing images.
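The mismatch joofa describes can be made concrete (a sketch using the Rec. 601 coefficients he cites, with the sRGB decoding curve standing in for the display nonlinearity): the same weights applied to nonlinear R'G'B' give "luma," which differs from the luminance obtained by linearizing first.

```python
def srgb_to_linear(v):
    """Decode a gamma-corrected sRGB value in [0, 1] to linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

R, G, B = 0.8, 0.4, 0.2  # gamma-encoded (nonlinear) pixel values

# "Technically incorrect" but universal: Rec. 601 weights on nonlinear R'G'B'.
luma = 0.299 * R + 0.587 * G + 0.114 * B

# "Technically correct": linearize first, then apply the weights.
luminance = (0.299 * srgb_to_linear(R)
             + 0.587 * srgb_to_linear(G)
             + 0.114 * srgb_to_linear(B))

print(round(luma, 3))       # weighted average of the encoded values
print(round(luminance, 3))  # linear-light luminance: a noticeably different number
```

Strictly, sRGB luminance would use Rec. 709 weights for its primaries; the 601 weights are kept here only to mirror joofa's example of "technically mismatched" coefficients.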
« Last Edit: February 27, 2010, 09:13:05 AM by joofa » Logged

Joofa
http://www.djjoofa.com
Download Photoshop and After Effects plugins
crames
Full Member
Posts: 210

« Reply #49 on: February 27, 2010, 09:25:45 AM »

Quote from: ejmartin
And those are...?
Linear gamma demo by Helmut Dersch (Panorama Tools)
« Last Edit: February 27, 2010, 09:28:39 AM by crames » Logged

Cliff
digitaldog
Sr. Member
Posts: 8036

« Reply #50 on: February 27, 2010, 09:44:05 AM »

Quote from: Schewe
First off, quoting Bruce is by definition a glimpse in history since Bruce has not had the opportunity to revise his opinions since 2006 when he passed away...

And yet, at least in terms of what he wrote about Lab above, I don’t believe anything at all has changed; his points are as valid today as they were the day he wrote them.

There may be better color appearance models today than in 2006 although I’m not privy to this being a fact. And Bruce was fully aware of this development, as I have posts he made about how such models would be better than using Lab.
Logged

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
joofa
Sr. Member
Posts: 485

« Reply #51 on: February 27, 2010, 11:11:30 AM »

Quote from: digitaldog
There may be better color appearance models today than in 2006 although I’m not privy to this being a fact. And Bruce was fully aware of this development, as I have posts he made about how such models would be better than using Lab.

The issue is not necessarily that the Lab space has problems. Many of the problems are well known. I think CIE's original intention in promoting the Lab space was a relatively uniform color space, meant for the specification of color differences in some controlled situations, and not a color appearance space. The point I am trying to make is that while authors are aware of problems with the Lab space, they might not have discussed how some of them can be resolved, i.e., the efforts that are underway to construct predictors of color appearance attributes. Bruce Fraser produced a large body of work and I have not read all of it; my analysis is based on the part I have seen, and it is possible that he addressed these issues elsewhere that I have not yet had access to. For example, the blue/purple discrimination issue is being addressed in various color difference formulae, which, though still not perfect, represent an effort at assimilation. Similarly, the issue of using an "incorrect" normalization of XYZ in Lab may be addressed with a full matrix instead of a diagonal matrix. Etc.

There are some aspects of color appearance that Lab is incapable of handling. However, the Lab space can still serve as a simple model, a benchmark against which to measure the improvements of more sophisticated models.
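Lab's benchmark role is clearest in the color-difference formulae mentioned above: the original CIE76 ΔE*ab is plain Euclidean distance in Lab, and later formulae (CIE94, CIEDE2000) are judged by how much they improve on it. A minimal sketch, with arbitrary example values:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two L*a*b* triples."""
    return math.dist(lab1, lab2)

# Two similar saturated blues; pairs in this region are exactly where
# later formulae reweight the distance to better match perception.
print(round(delta_e_76((32.3, 79.2, -107.9), (30.0, 68.0, -112.0)), 2))
```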
« Last Edit: February 27, 2010, 12:22:26 PM by joofa » Logged

Joofa
http://www.djjoofa.com
Download Photoshop and After Effects plugins
digitaldog
Sr. Member
Posts: 8036

« Reply #52 on: February 27, 2010, 11:21:58 AM »

Quote from: joofa
The issue is not necessarily that Lab space has problems.
But it does.
Quote
Many of the problems are well-known.
Yes they are. But the two sentences above from you seem to contradict each other.
Quote
I think CIE's original intention in promoting Lab space was a relatively uniform color space, meant for the specification of color differences in some controlled situations, and not a color appearance space.
Agreed. That’s exactly what Bruce wrote. And even then, there are some issues which he points out too.
Quote
The point I am trying to make is that while authors, such as Bruce Fraser, are aware of problems with Lab space, they do not discuss how to resolve some of them, i.e, some of the efforts that are underway to construct some predictors of color appearance attributes.
The problems with Lab can’t be resolved. Having more robust color appearance models could, but they either don’t exist or don’t exist in any products we can use. So it’s like saying we need anti-gravity machines. That would be cool. Until such technology exists, what we need and want is kind of moot. And in the context of this series of discussions around Lab editing, nothing has changed.
Quote
There are some aspects of color appearance that Lab is incapable of handling. However, the Lab space should be used as a simple model that may be utilized as a benchmark to measure the improvements of more sophisticated models.
Indeed. And the problem is, Lab isn’t a color appearance model, or at least not a very good one, which is the point Bruce makes.
Logged

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
joofa
Sr. Member
Posts: 485

« Reply #53 on: February 27, 2010, 11:28:33 AM »

Quote from: digitaldog
The problems with Lab can’t be resolved.

In my previous message I gave specific examples of a few problems that can be resolved to some extent.

Quote from: digitaldog
Having more robust color appearance models could, but they either don’t exist or don’t exist in any products we can use. So it’s like saying we need anti-gravity machines. That would be cool. Until such technology exists, what we need and want is kind of moot. And in the context of this series of discussions around Lab editing, nothing has changed.

DigitalDog, this is where we are running in circles. My point has consistently been not to base arguments on what is offered by current technology or products. I have been urging a distinction between the theory and the implementation of some portion of that theory in available products. In one of my messages above I mentioned that products such as Photoshop may not necessarily need to modernize, since they are typically used to make visually pleasing images, and not necessarily scientifically or technically correct images.
« Last Edit: February 27, 2010, 11:29:25 AM by joofa » Logged

Joofa
http://www.djjoofa.com
Download Photoshop and After Effects plugins
crames
Full Member
Posts: 210

« Reply #54 on: February 27, 2010, 11:47:19 AM »

Quote from: digitaldog
The problems with Lab can't be resolved. Having more robust color appearance models could, but they either don't exist or don't exist in any products we can use. So it's like saying we need anti-gravity machines. That would be cool. Until such technology exists, what we need and want is kind of moot. And in the context of this series of discussions around Lab editing, nothing has changed.
Here is your anti-gravity machine: CIECAM02 Plugin  

But sorry, only for Windows....
Logged

Cliff
joofa
Sr. Member
Posts: 485

« Reply #55 on: February 27, 2010, 12:49:40 PM »

Quote from: crames
Here is your anti-gravity machine: CIECAM02 Plugin

Wow, Cliff, you have done some very interesting work!

Logged

Joofa
http://www.djjoofa.com
Download Photoshop and After Effects plugins
bjanes
Sr. Member
Posts: 2714

« Reply #56 on: February 27, 2010, 08:08:09 PM »

Quote from: digitaldog
The problems with Lab can’t be resolved. Having more robust color appearance models could, but they either don’t exist or don’t exist in any products we can use. So it’s like saying we need anti-gravity machines. That would be cool. Until such technology exists, what we need and want is kind of moot. And in the context of this series of discussions around Lab editing, nothing has changed.

Indeed. And the problem is, Lab isn’t a color appearance model, or at least not a very good one, which is the point Bruce makes.
Of course, unmentioned in this discussion is the fact that ProPhoto RGB is not a color appearance model either. As this old reference to CIECAM97 points out, a color appearance model starts with tristimulus values and takes viewing conditions, background, and other factors into account in order to predict the appearance of the color of an object under the specified viewing conditions. The source tristimulus values could be expressed in either L*a*b* or ProPhoto RGB. Bruce Lindbloom has a calculator that can be used to convert between XYZ, L*a*b*, and various RGB spaces. No one is saying that ProPhoto RGB is not suitable for editing images.

That CIE L*a*b* was developed merely to quantify differences in color is somewhat disingenuous. It is true that it was developed as a perceptually uniform space, one in which a given distance between colors would correspond to a uniform perceptual difference, but it was derived from the CIE 1931 XYZ color space and inherits the attributes of that space. Under the specified viewing conditions, L*a*b* coordinates will predict the appearance of a color just as accurately as the original 1931 scheme. Problems arise when these conditions are not met.
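The XYZ-to-Lab conversion bjanes mentions is mechanical; a minimal sketch of the forward transform relative to a D50 white (the same math behind Lindbloom's calculator):

```python
def xyz_to_lab(X, Y, Z, white=(0.9642, 1.0, 0.8249)):
    """CIE XYZ to L*a*b* relative to a reference white (D50 by default)."""
    def f(t):
        # Cube root above a small threshold, linear segment below it.
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = (f(v / w) for v, w in zip((X, Y, Z), white))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

# The reference white itself maps to L* = 100, a* = b* = 0.
L, a, b = xyz_to_lab(0.9642, 1.0, 0.8249)
print(round(L, 1), round(a, 1), round(b, 1))  # 100.0 0.0 0.0
```

Note that the white point appears explicitly in the formula, which is exactly why Lab values only predict appearance under the viewing conditions they were computed for.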
Logged
digitaldog
Sr. Member
Posts: 8036

« Reply #57 on: February 27, 2010, 08:51:53 PM »

Quote from: bjanes
Of course, unmentioned in this discussion, is the fact that ProPhotoRGB is not a color appearance model either.
No, it’s not; it’s an RGB working space (which apparently isn’t obvious).
Quote
That CIE L*a*b was developed merely to quantify differences in color is somewhat disingenuous. It is true that it was developed as a perceptually uniform space where a given distance between colors would be perceptually uniform, but it was derived from the the CIE 1931 XYZ color space and inherits the attributes of that space.

Except it isn’t fully perceptually uniform. But yes, its design goal was to predict (report) color differences with a numeric value, as Bruce notes above, not to serve as an editing space (again, as Bruce mentioned). At the time, Photoshop and such image processing were the realm of science fiction (let alone a task anyone then even contemplated).
Logged

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
digitaldog
Sr. Member
Posts: 8036

« Reply #58 on: February 27, 2010, 09:04:18 PM »

Quote from: bjanes
Under the specified viewing conditions, L*a*b* coordinates will accurately predict the appearance of a color just the same as with the original 1931 scheme. Problems arise when these conditions are not met.

Keep in mind that Lab was just an attempt to create a perceptually uniform color space where equal steps correlated to equal color closeness based on the perception of a viewer. The CIE didn't claim it was perfect (because it's not). Most color scientists will point out that Lab exaggerates the distances in the yellows and underestimates the distances in the blues. Lab assumes that hue and chroma can be treated separately. There's an issue where hue lines bend with increasing saturation, perceived by viewers as an increase in both saturation and a change in hue, when that's really not supposed to be occurring. Further, according to Karl Lang, there is a bug in the definition of the Lab color space. If you are dealing with a very saturated blue that's outside the gamut of, say, a printer, then when one uses a perceptual rendering intent, the CMM preserves the hue angle and reduces the saturation in an attempt to make a less saturated blue within that gamut. The result is mathematically the same hue as the original, but it ends up appearing purple to the viewer. This is unfortunately accentuated with blues, causing a well recognized shift towards magenta. And as I alluded to above, it's important to keep in mind that the Lab color model was standardized way back in 1976, long before anyone had thoughts about digital color management.
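Karl Lang's blue-to-purple mechanism can be sketched in isolation (illustrative Lab values, not output from a real CMM): a perceptual-intent-style desaturation holds the Lab hue angle h_ab mathematically constant while shrinking chroma, yet viewers perceive the result as shifted toward purple, because Lab's constant-hue lines bend in the blue region.

```python
import math

def reduce_chroma(L, a, b, factor):
    """Desaturate in Lab: shrink chroma C*_ab while keeping the hue angle
    h_ab fixed -- the gamut-mapping move described above."""
    C = math.hypot(a, b)
    h = math.atan2(b, a)  # hue angle, preserved exactly
    C2 = C * factor
    return L, C2 * math.cos(h), C2 * math.sin(h)

# A highly saturated blue (roughly the sRGB blue primary in Lab terms).
L, a, b = 32.3, 79.2, -107.9
L2, a2, b2 = reduce_chroma(L, a, b, 0.5)

# The mathematical hue angle is the same before and after...
print(round(math.degrees(math.atan2(b, a)), 1) ==
      round(math.degrees(math.atan2(b2, a2)), 1))  # True
# ...yet viewers see the desaturated result as purple, since constant
# Lab hue does not equal constant perceived hue for saturated blues.
```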
Logged

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
digitaldog
Sr. Member
Posts: 8036

« Reply #59 on: February 27, 2010, 09:09:59 PM »

Quote from: bjanes
No one is saying that ProPhotoRGB is not suitable for editing images.

It most certainly can be (it’s not perfect and there are caveats). For one, you can define “colors” with numeric values that we can’t see (hence, they are not colors).
Logged

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/