ColorSpace for CGI Artist, Part II : ACES(cg)

In the previous article we reviewed how to work in a linear color space. This is an important step towards photorealistic renders, but it is not enough. We will see why, and introduce ACES to improve our renders even further.

A few notes about this article and the glossary I will use. When I use the word sRGB, I mean a linear color space with sRGB primaries displayed with an sRGB profile. When I use ACES, I mean a linear color space with the ACES AP1 primaries displayed with an ACES sRGB Output Transform.

This means that in both cases you end up displaying in sRGB, as most monitors are made to support only sRGB. The ACES Output Transform converts the resulting colors so you can display them on your monitor, much like what is done with footage from cinema cameras.

Limitations of sRGB

From Wikipedia:

sRGB (standard Red Green Blue) is an RGB color space that HP and Microsoft created cooperatively in 1996 to use on monitors, printers, and the Internet. It was subsequently standardized by the IEC as IEC 61966-2-1:1999.[1] It is often the “default” color space for images that contain no color space information, especially if the images’ pixels are stored in 8-bit integers per color channel. sRGB uses the ITU-R BT.709 primaries, the same as in studio monitors and HDTV,[2] a transfer function (gamma curve) typical of CRTs, and a viewing environment designed to match typical home and office viewing conditions. This specification allowed sRGB to be directly displayed on typical CRT monitors of the time, which greatly aided its acceptance.

sRGB was invented quite a long time ago, when we were still using CRT monitors. It became a standard and today we still live with its legacy. This standard was needed, but it exhibits a few caveats that were not a problem at the time, both because of the use of CRT monitors and because achieving photorealistic renders demands more colorimetric accuracy.

Restricted Gamut

The color gamut describes a range of color within the spectrum of colors that are identifiable by the human eye (visible color spectrum).

As you can see, of the full range of colors visible to our eyes, only a small portion can be rendered with the sRGB standard. This may not be a problem when you look at a picture, but it affects the color calculations within your shaders, which obviously contribute to the final render.

Remember that the picture you take with your camera is the result of complex processing, and in the end only a portion of the captured colors is converted to your monitor profile. This is why a picture looks different from what you see with your eyes.

Restricted Dynamic range

The second restriction towards photorealistic renders is the dynamic range provided by the sRGB profile.

Dynamic range is very important to our perception. Even when capturing reality with an actual camera, you can spot the difference in image quality between an image captured with a smartphone, a DSLR, a cinema camera such as an Alexa, and your own eyesight.

Compared to an actual camera, the sRGB dynamic range is very limited. This is a prime reason why your renders will not look photorealistic even with a physically based renderer.

The render above was made around 2008 in RenderMan. At the time we didn’t have any physically based shaders. Despite that, the render looked photorealistic because we were aware of primaries and dynamic range. ( ACES didn’t exist yet, so we had our own homebrew colour system )

Below I have plotted the display curve for sRGB and ACES.

As you can see, the slopes are different. The ACES curve compresses the highlight values a lot more. There is more room to represent this information, so your highlights will have better definition. If you think ACES is applying a tonemapping, you are correct.
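To build an intuition for what this compression buys you, here is a small sketch. It is not the actual ACES RRT math, which is far more sophisticated; it only contrasts a plain display clip with a simple Reinhard-style rolloff, a classic stand-in for a filmic highlight curve:

```python
# Illustration only: a hard display clip vs. a Reinhard-style rolloff.
# The real ACES Output Transform uses a more sophisticated S-shaped curve,
# but the idea is the same: highlights are compressed instead of clipped.

def clipped_display(x):
    """sRGB-style display: any scene value above 1.0 is simply lost."""
    return min(max(x, 0.0), 1.0)

def rolloff_display(x):
    """Reinhard-style rolloff: x / (1 + x) approaches but never reaches 1.0,
    so bright values keep some definition instead of all mapping to white."""
    return x / (1.0 + x)

# A mid-grey, a "white" and two highlight intensities in linear scene light:
for scene_value in [0.18, 1.0, 4.0, 16.0]:
    print(scene_value, clipped_display(scene_value),
          round(rolloff_display(scene_value), 3))
```

Note how 4.0 and 16.0 both clip to exactly 1.0 in the first column, while the rolloff still distinguishes them: that surviving difference is your highlight definition.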

This lack of dynamic range affects several parameters :

  • The intensity of the lights you use will be lower than expected in a physically based render.
  • How you paint or tweak your albedo maps ( often named diffuse ).
  • How you tweak the roughness ( or glossiness ) in your shaders.
  • How you tweak the IOR in your shaders.

Not having the correct light intensity in your scene will lead to inaccuracy in how you set the other parameters. I have seen, and actually still see, talented artists tweaking their shaders because the physical values do not look right. In fact, they were compensating their albedos, roughness, or IOR based on what the sRGB curve was displaying.

It is not wrong per se, but usually their assets then needed tweaks depending on the shot or sequence mood. They were fixing something in the wrong place. With a better display transform you can safely use the physical values of a material ( IOR, albedo, etc… ) and it will look as expected regardless of the lighting of your scene.

What is ACES ?

The Academy Color Encoding System (ACES) is a color image encoding system created by hundreds of industry professionals under the auspices of the Academy of Motion Picture Arts and Sciences. ACES allows for a fully encompassing color accurate workflow, with “seamless interchange of high quality motion picture images regardless of source”.[1]

In other words, it is a standard that allows any user to see the same colors regardless of the device used and of their position in the chain. For instance, if you are on a set and you need to shoot with a combination of different cameras ( eg. Alexa, RED or GoPro ), you will need to conform the colorimetry of each camera to a single one, otherwise you will notice the difference in quality ( each camera has a different response to the incoming light ). ACES conforms every source, including your CG, to the same color space.

How can I use ACES ? OpenColorIO !

The easiest way to work with ACES is to use a color management system called OpenColorIO, originally developed at Sony Pictures Imageworks. It is supported by many applications such as Nuke, Maya, Krita, and most renderers’ framebuffers. So you only have to set it up once and all your software will be able to use this configuration. You can download it here. ( click on sample config )

OpenColorIO ( or OCIO)

Once you have downloaded OCIO and unpacked the archive you will notice there is no software to install. OCIO comes with a series of lookup tables and a file named config.ocio for each collection.

This text file specifies which lookup table to use, with rules for every case. I will probably cover how to create your own custom configuration in another post.
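To give you an idea of what it looks like, here is a heavily trimmed, hypothetical sketch of a config.ocio. The real ACES configs define many more colorspaces, views, and rules, so treat this only as an illustration of the structure:

```yaml
# Hypothetical, trimmed sketch of a config.ocio -- not a working config.
ocio_profile_version: 1

search_path: luts          # where the lookup tables live, relative to this file

roles:
  scene_linear: ACES - ACEScg        # what "linear" means for your renderer
  color_picking: Output - Rec.709    # what color pickers display

displays:
  sRGB:
    - !<View> {name: ACES, colorspace: Output - sRGB}
    - !<View> {name: Raw,  colorspace: Utility - Raw}
```

The roles section is what lets different applications agree on which colorspace to use for a given task without hardcoding names.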

First we will remove the folders we don’t need. It will save you a lot of disk space.
Keep only the aces_1.0.3 folder.

This is the content of the OCIO download

From here you have two options to register ACES configuration.

Register ACES on each software

You will have to set the path to your config.ocio in every software package supporting OpenColorIO. If the file is correct, the new rules will be applied.

Register ACES on your system using environment variable

Setting OCIO as an environment variable

On Linux you will have to set the environment variable, using either the command set or export, before you launch your software.
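For example, assuming you unpacked the configs under your home directory (the path here is hypothetical, adjust it to wherever you put the aces_1.0.3 folder):

```shell
# Point OCIO at the ACES config (hypothetical path -- adapt to your install).
export OCIO="$HOME/ocio/aces_1.0.3/config.ocio"

# Launch your software from this same shell so it inherits the variable, e.g.:
# nuke &
```

Any OCIO-aware application launched from this shell will pick up the configuration automatically.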

If it worked, you should see multiple colorspaces available.

Using ACES

The workflow is pretty much the same as what was described as the “linear workflow” in the previous post. Your texture inputs need to be linear and you display your render with an sRGB profile. The principle stays the same with ACES. The difference is that we use the ACEScg primaries instead of the sRGB primaries for better colour accuracy, and we use an ACES sRGB Output Transform to display our render.

I said sRGB Output Transform because I assume you have an sRGB monitor. Other Output Transforms are available if you are lucky enough to have other devices to display your images ( like a P3 or Rec.2020 monitor ).

By default there are a lot of colorspaces available with ACES. A lot of them have to do with real cameras. Some of them are also duplicated to be used as aliases.

From a CG artist’s point of view you will need to deal with only a few of them :

  • raw
  • acescg
  • Utility – sRGB – Texture ( alias srgb_texture)
  • Utility – linear – sRGB ( alias lin_srgb )

raw means no transformation is performed. Use this one when you are dealing with data-driven maps such as displacement, normal maps, bump maps, masks, or roughness.

acescg is the linear color space with the ACES primaries. You will need to use this colorspace if your color maps, such as diffuse, albedo or SSS, have already been generated with the ACES primaries.

Utility – sRGB – Texture converts your sRGB picture to linear and your sRGB primaries to the ACES primaries. This is useful when your source is a still camera. If you don’t convert the primaries, it will look wrong. Note that this conversion will also “compress” your white values.

Utility – linear – sRGB transforms only your primaries. This is useful if you have to deal with textures that are already in linear space but with sRGB primaries. It is also useful if you have an HDR library, because HDRs have often been generated with sRGB primaries.

If you don’t convert your images to get the correct primaries, they will look off. If you try to display a picture or a texture made with sRGB primaries through the ACES RRT, it will look oversaturated.
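As a sanity check of what Utility – sRGB – Texture does under the hood, here is a minimal Python sketch of its two steps. In production you would let OCIO apply this for you; the 3×3 matrix below is the commonly published sRGB-to-ACEScg conversion (with a Bradford D65-to-D60 chromatic adaptation), rounded to four decimals:

```python
# Sketch of "Utility - sRGB - Texture": decode the sRGB transfer function,
# then convert sRGB primaries to ACEScg (AP1) primaries.
# Illustrative only -- in production, let OCIO do this.

def srgb_decode(v):
    """Inverse sRGB transfer function: display value -> linear value."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# Commonly published sRGB -> ACEScg matrix (Bradford CAT, D65 -> D60),
# rounded to four decimals.
SRGB_TO_ACESCG = [
    [0.6131, 0.3395, 0.0474],
    [0.0702, 0.9164, 0.0134],
    [0.0206, 0.1096, 0.8698],
]

def srgb_texture_to_acescg(rgb):
    lin = [srgb_decode(c) for c in rgb]
    return [sum(m * c for m, c in zip(row, lin)) for row in SRGB_TO_ACESCG]

# Neutral colors stay neutral: each matrix row sums to ~1.0,
# so sRGB white maps to ACEScg white.
print(srgb_texture_to_acescg([1.0, 1.0, 1.0]))
```

Notice that a saturated sRGB red picks up a little green and blue after conversion: the sRGB primaries sit inside the wider AP1 gamut, which is exactly why skipping this conversion makes textures look oversaturated.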

Take away

In this post we have :

  • Established the different limitations of the usual linear workflow using sRGB primaries and an sRGB profile.
  • Gotten an introduction to ACES.
  • Learned how to enable ACES with OpenColorIO.
  • Learned that the workflow is quite similar to the linear workflow.
  • Finally, reviewed which colorspaces are actually useful for CG artists.

Special thanks to Nick Shaw for his corrections on this post.


7 Replies to “ColorSpace for CGI Artist, Part II : ACES(cg)”

  1. Thanks Harry for the post. I am still wondering about the color picker in ACES. Do you know why it is ‘Output – Rec.709’ by default in the OCIO config ? Thanks !


    1. If you set your config.ocio using the environment variable then it will automatically be enabled in software supporting OCIO. Mari will automatically pick it up.

      For Substance there is partial support, you will need to generate a colour profile. I’ll post one later.

      For ZBrush I am not sure there is any colour management, so I would not recommend it for any texturing work.

      Photoshop only supports ICC profiles, so no native support. However, there is a plugin:


  2. “I have seen, and actually still see, talented artist tweaking their shaders because the physical value does not look right.”

    Amen! If you want to automagically improve many lighting artists’ workflows, force them to work in a framebuffer with a highlight rolloff/‘film curve’.

    I don’t think many artists realize how much of an image from a professional camera “clips” to white in linear space without a highlight compression/rolloff curve. As a result they grossly underlight their scenes and then add too much fill, trying to fit all of the dynamic range between 0.0 and 1.0, instead of realizing that Alexa and RED footage by default actually hard clips somewhere around 3.0 to 12.0 in linear light.

    Trying to squeeze realistic scene dynamic range into 0.0 to 1.0 means you end up lighting the scene like the set of a daytime talk show. It just looks imperceptibly “off”.

