Speaking of screenshots, I learned something about OS X color management over the summer when I was working on a web coding editor in my spare time.
The editor is basically a tricked-out code editor for web programmers. It’s geared toward PHP/Ruby developers, and I tried to keep the interface as minimal as possible. Most of the value of the product is in language intelligence, so most of the features are delivered through fancy code hints or heads-up displays.
Of course, because this is for web stuff, there are features for HTML and CSS. The way you pick colors is, essentially, that the Mac color picker is the “code hint” that shows up when your cursor is inside a color value. Meanwhile, the color you pick gets used in the code coloring, so you can see the colors in your CSS file just by reading the text.
It actually makes more sense when you see it live. The text and color update in real time as you move the sliders.
Well, anyway, I kept battling this bug where the color displayed in the editor did not exactly match the color chosen in the color picker. It turns out that when color management is on, there is basically no way to get this 100% right. Either you’re inconsistent with one thing or you’re inconsistent with another.
I kept trying different combinations of converting between so-called “device color” and the generic color space, until I realized something.
Even the Apple OS doesn’t deal with it consistently. ARRGH!
The browser renders #00ff00 as R:0, G:255, B:0. On my monitor, however, the color picker renders #00ff00 as R:137, G:250, B:0.
What is going on?
This is all a byproduct of color management, which all makes sense once you untangle everything, but boy, is it confusing.
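Here’s a rough numeric sketch of the kind of conversion that produces numbers like those. I’m assuming a wide-gamut display profile close to Adobe RGB (1998) — the real ICC profile of any particular monitor will differ, so the output won’t exactly match the 137/250/0 above, but the shape of the effect is the same. The function names and matrices below are my own illustration, not any Apple API:

```python
def srgb_decode(c):
    """8-bit sRGB channel -> linear light (sRGB piecewise curve)."""
    c = c / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def adobe_encode(c):
    """Linear light -> 8-bit Adobe RGB channel (gamma ~2.2), clamped."""
    c = min(max(c, 0.0), 1.0)
    return round(255 * c ** (1 / 2.2))

def srgb_to_adobe(r, g, b):
    """Convert an 8-bit sRGB color to 8-bit Adobe RGB via CIE XYZ."""
    rl, gl, bl = srgb_decode(r), srgb_decode(g), srgb_decode(b)
    # sRGB linear -> CIE XYZ (D65 white point)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # CIE XYZ -> Adobe RGB (1998) linear
    ra = 2.04159 * x - 0.56501 * y - 0.34473 * z
    ga = -0.96924 * x + 1.87597 * y + 0.04156 * z
    ba = 0.01344 * x - 0.11836 * y + 1.01517 * z
    return adobe_encode(ra), adobe_encode(ga), adobe_encode(ba)

print(srgb_to_adobe(0, 255, 0))  # pure sRGB green lands around (144, 255, 60)
```

The point: once the color is mapped into the display’s gamut, “pure” sRGB green picks up red and blue components — exactly the kind of shift the color picker is showing.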
I had never thought much about color management before, but I guess I imagined it worked something like this:
1) Apps draw into windows
2) OS X composites together the entire screen
3) Color management is applied globally at the very end, either by double buffering the entire screen buffer, or perhaps through hardware magic.
Instead, it happens like this:
1) Apps draw into windows. Each app decides whether it’s color managed or not.
2) When an app draws a specific color (such as #00ff00), that color may end up with different pixel values after color management.
3) These bitmaps are then composited by OS X, and sent to the screen.
I’m sure there are lots of other complexities (e.g., the color spaces embedded inside of images. Ack! I just use sRGB and hope for the best!).
So now what? Who cares? And isn’t this post about screenshots?
Well, for starters, as far as I can tell, there is no way to take a screenshot of an OS X window without having color management affect the pixel values of the UI.
Let’s say I want to take a screenshot of a Finder window to post to the web. If I take the screenshot on my monitor, it will look one way. If I take the screenshot on another monitor, the colors will be different. ARGH!
Now, as it turns out, I turned on color management in Firefox (because I don’t want photographs to look oversaturated on my wide-gamut monitor), but the price of that is you can get into some weird territory. For example, when I view the screenshot I took of those OS X windows in Firefox, the color management transforms are applied AGAIN, even though the screenshot has already had them applied, leading to weird colors.
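You can see the “applied AGAIN” problem in numbers: the screenshot stores the already-converted device pixels, and a color-managed viewer that assumes they’re sRGB converts them a second time. A self-contained sketch, again pretending the display profile is Adobe RGB (1998) — real profiles differ, and `manage` is my own illustration, not a real API:

```python
def to_linear(c):
    """8-bit sRGB channel -> linear light (sRGB piecewise curve)."""
    c /= 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def manage(rgb):
    """One pass of 'sRGB source -> Adobe-RGB-ish display' conversion."""
    r, g, b = (to_linear(c) for c in rgb)
    # sRGB linear -> CIE XYZ (D65)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    # CIE XYZ -> Adobe RGB (1998) linear, then gamma-encode and clamp
    out = (2.04159 * x - 0.56501 * y - 0.34473 * z,
           -0.96924 * x + 1.87597 * y + 0.04156 * z,
           0.01344 * x - 0.11836 * y + 1.01517 * z)
    return tuple(round(255 * min(max(c, 0.0), 1.0) ** (1 / 2.2)) for c in out)

once = manage((0, 255, 0))   # what the window (and so the screenshot) contains
twice = manage(once)         # what a color-managed viewer then displays
print(once, twice)           # the green drifts further on the second pass
```

Each pass pushes the color further from the original #00ff00, which is exactly the washed-out weirdness I’m seeing in Firefox.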
ACK! So.. does anyone know the “right” way to take screenshots under color management?