Learn Color Theory
Master color with interactive guides.
Beyond the everyday color models (RGB, HSL, CMYK), a rich world of color science underpins how we measure, perceive, and reproduce color. These spaces and models are used in research, video, print, and next-generation display technologies.
XYZ (CIE 1931)
The foundational, device-independent color space
CIE 1931 XYZ is the master color space from which nearly all others are derived. It was built from experiments measuring how the average human observer matches colors, and it defines color independently of any device.
Key Properties
- X — A mix of cone responses weighted toward red sensitivity
- Y — Luminance (brightness as perceived by humans)
- Z — Roughly blue-sensitivity
XYZ is the intermediate step whenever PerfectPalette converts between RGB and LAB. Every serious color conversion eventually passes through XYZ coordinates — it is the lingua franca of color science.
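The first leg of that conversion can be sketched with the standard sRGB-to-XYZ formula (IEC 61966-2-1, D65 white point): undo the gamma curve, then apply a fixed 3×3 matrix. The function name here is illustrative, not PerfectPalette's actual API:

```python
def srgb_to_xyz(r8, g8, b8):
    """Convert 8-bit sRGB to CIE XYZ (D65 white point)."""
    def linearize(c):
        c /= 255.0
        # Undo the sRGB transfer curve (gamma)
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = linearize(r8), linearize(g8), linearize(b8)
    # Standard sRGB-to-XYZ matrix (IEC 61966-2-1, D65)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z
```

White (255, 255, 255) maps to roughly (0.9505, 1.0, 1.089), the D65 reference white; the reverse trip (XYZ to LAB) starts from exactly these coordinates.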
The Mother of All Color Spaces
LAB, LCH, OKLab, OKLCH, and every ICC profile are built on top of XYZ. The famous “horseshoe diagram” you see in color science is the XYZ chromaticity plot — it maps the entire range of colors visible to humans, with the sRGB triangle showing what your monitor can display.
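The horseshoe plot drops luminance by normalizing the tristimulus values; the projection is just two divisions (helper name illustrative):

```python
def xy_chromaticity(x, y, z):
    """Project XYZ tristimulus values onto the 2-D chromaticity diagram."""
    total = x + y + z
    return x / total, y / total
```

D65 white lands at roughly (0.3127, 0.3290), the familiar white point near the center of the horseshoe.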
YUV / YCbCr
Luminance-chrominance for video and broadcast
YUV and its digital variant YCbCr separate luminance (Y — brightness) from chrominance (U/V or Cb/Cr — color information). The split was originally designed for backward-compatible color TV broadcasts and is now the basis of every major video codec.
Key Properties
- Y — Luma (perceived brightness)
- Cb — Blue-difference chrominance
- Cr — Red-difference chrominance
Human vision is more sensitive to brightness than to color detail. Video codecs exploit this by storing luma at full resolution and subsampling chroma (4:2:2 or 4:2:0), cutting raw pixel data by a third or a half with minimal visible quality loss.
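The arithmetic behind those savings is easy to check using the standard J:a:b subsampling notation (a sketch; the helper name is made up):

```python
def samples_per_pixel(scheme):
    """Average stored samples per pixel for a J:a:b chroma subsampling scheme."""
    # In a J-wide, 2-row block: J luma samples per row; `a` chroma samples
    # (per channel) in the first row, `b` in the second row.
    j, a, b = scheme
    luma = 2 * j                  # full-resolution Y
    chroma = 2 * (a + b)          # Cb + Cr samples combined
    return (luma + chroma) / (2 * j)

# Savings of 4:2:0 versus unsubsampled 4:4:4 (3 samples per pixel)
savings = 1 - samples_per_pixel((4, 2, 0)) / 3
```

4:2:0 stores 1.5 samples per pixel instead of 3, a 50% reduction; 4:2:2 stores 2, a 33% reduction.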
Why Your TV Understands Contrast
Broadcast engineers separated brightness from color to make color TV signals backward-compatible with black-and-white sets. This same principle explains why WCAG contrast calculations use relative luminance rather than raw color differences — our eyes resolve brightness detail far better than color detail.
HCL (Hue-Chroma-Lightness)
LCH with reordered initials — the data visualization standard
HCL is the same space as LCH (CIE L*C*h°) with the initials reordered to emphasize Hue first. It is the go-to color space for scientific visualization and data-driven color ramps.
Libraries like D3.js (d3.hcl()) and R's ggplot2 use HCL to produce perceptually uniform color scales — ensuring that a gradient from blue to red looks equally spaced to human eyes, not just mathematically.
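A minimal sketch of such a ramp, converting HCL through LAB and XYZ down to sRGB with the standard D65 formulas (function name illustrative; libraries like d3.hcl() handle out-of-gamut colors more carefully than the simple clamp used here):

```python
import math

def hcl_to_srgb(h, c, l):
    """Convert HCL (hue in degrees, chroma, lightness) to 8-bit sRGB."""
    # HCL/LCH -> LAB: chroma and hue are polar coordinates of (a*, b*)
    a = c * math.cos(math.radians(h))
    b = c * math.sin(math.radians(h))
    # LAB -> XYZ (D65 reference white)
    def f_inv(t):
        return t ** 3 if t > 6 / 29 else 3 * (6 / 29) ** 2 * (t - 4 / 29)
    fy = (l + 16) / 116
    x = 0.9505 * f_inv(fy + a / 500)
    y = 1.0000 * f_inv(fy)
    z = 1.0890 * f_inv(fy - b / 200)
    # XYZ -> linear sRGB
    rl = 3.2406 * x - 1.5372 * y - 0.4986 * z
    gl = -0.9689 * x + 1.8758 * y + 0.0415 * z
    bl = 0.0557 * x - 0.2040 * y + 1.0570 * z
    # Apply the sRGB transfer curve, clamping out-of-gamut values
    def encode(v):
        v = min(max(v, 0.0), 1.0)
        return 12.92 * v if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055
    return tuple(round(encode(v) * 255) for v in (rl, gl, bl))

# A perceptually even ramp: hold chroma and lightness fixed, step only hue
ramp = [hcl_to_srgb(h, 40, 65) for h in range(0, 360, 45)]
```

Because only the hue channel varies, every stop in the ramp has the same perceived lightness, which is exactly what makes HCL ramps safe for encoding data.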
Want to explore interactively? Switch to the LCH tab in Color Modes above.
The Data Viz Standard
When you see a well-designed heatmap or choropleth, it likely uses HCL/LCH under the hood. Perceptually uniform color ramps prevent misleading visual artifacts — ensuring your audience reads the data, not the color encoding flaws.
IPT
Perceptual space with superior hue uniformity
IPT was designed specifically for hue uniformity. Where CIE LAB can shift perceived hue when you change lightness (especially in blues and purples), IPT keeps hue lines straight — making it ideal for gamut mapping.
Key Properties
- I — Intensity (lightness), similar to LAB L*
- P — Protan axis (red-green, roughly a*)
- T — Tritan axis (yellow-blue, roughly b*)
Better Hue Uniformity Than LAB
In LAB, changing L* from 50 to 70 on a blue color can cause the hue to shift toward purple. IPT was built from experimental data to keep hue lines parallel across all lightness levels — when you lighten a blue, it stays blue. This property inspired the design of OKLab.
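The IPT transform itself is compact: an XYZ-to-LMS cone matrix, a 0.43 power nonlinearity, and a second matrix. This sketch uses coefficients from the published Ebner & Fairchild model; the function name is illustrative:

```python
def xyz_to_ipt(x, y, z):
    """Convert CIE XYZ (D65-relative, Y=1 white) to IPT."""
    # XYZ -> LMS cone responses (Hunt-Pointer-Estevez, D65-normalized)
    l = 0.4002 * x + 0.7075 * y - 0.0807 * z
    m = -0.2280 * x + 1.1500 * y + 0.0612 * z
    s = 0.9184 * z
    # Sign-preserving 0.43 power nonlinearity
    def nl(v):
        return abs(v) ** 0.43 * (1 if v >= 0 else -1)
    lp, mp, sp = nl(l), nl(m), nl(s)
    # LMS' -> IPT
    i = 0.4000 * lp + 0.4000 * mp + 0.2000 * sp
    p = 4.4550 * lp - 4.8510 * mp + 0.3960 * sp
    t = 0.8056 * lp + 0.3572 * mp - 1.1628 * sp
    return i, p, t
```

As a sanity check, D65 white maps to I ≈ 1 with P and T ≈ 0, i.e. full intensity and zero chromatic content.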
ICtCp
The HDR video standard for next-generation displays
ICtCp is the perceptual color space designed for HDR content and the BT.2100 broadcast standard. It encodes Intensity (I), blue-yellow (Ct), and red-green (Cp) in a way optimized for the wider BT.2020 gamut and high dynamic range.
Key Properties
- I — Intensity (perceptual brightness for HDR)
- Ct — Tritan-like (blue-yellow chrominance)
- Cp — Protan-like (red-green chrominance)
As displays evolve from sRGB to Display P3 and eventually BT.2020, palette tools will need to account for wider gamuts. ICtCp provides the perceptual foundation for ensuring colors look correct on next-generation HDR screens.
The HDR Standard
Dolby Vision and BT.2100 HDR content use ICtCp internally. Unlike older spaces, ICtCp handles the extreme luminance range of HDR (up to 10,000 nits) while maintaining perceptual uniformity — something LAB was never designed to do.
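ICtCp's intensity channel is built on the SMPTE ST 2084 "PQ" transfer function, which maps absolute luminance up to 10,000 nits into a [0, 1] signal. A sketch of the PQ encoding (inverse EOTF), using the constants from the standard:

```python
def pq_encode(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> signal in [0, 1]."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0          # normalize to the 10,000-nit peak
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2
```

The curve spends about half of its code values below roughly 100 nits (typical SDR peak brightness), which is how PQ packs a 10,000-nit range into the same bit depth without visible banding.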
Spectral Color
Wavelength-based color — the ground truth of light
All color models are approximations. Spectral color represents light by its actual wavelength distribution — the full reflectance or emission curve across 380–780 nm. This is how color exists in the physical world before any sensor or screen interprets it.
How It Works
- Each surface has a reflectance curve (how much light it reflects at each wavelength)
- Each light source has an emission spectrum (its wavelength distribution)
- The perceived color depends on both: surface × illuminant → XYZ → RGB
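The surface × illuminant pipeline above can be illustrated with a toy numerical example. All sample values below are made-up placeholders, not measured data; real pipelines integrate the CIE color-matching functions tabulated at 1–5 nm steps:

```python
# Toy spectral rendering at three coarse wavelength samples (real pipelines
# use the full 380-780 nm range). Every number here is illustrative only.
wavelengths = [450, 550, 650]        # nm
illuminant  = [0.9, 1.0, 0.9]        # relative emission of the light source
reflectance = [0.2, 0.8, 0.3]        # fraction the surface reflects

# Toy stand-ins for the CIE color-matching functions (x-bar, y-bar, z-bar)
xbar = [0.3, 0.4, 0.3]
ybar = [0.05, 0.95, 0.3]
zbar = [1.7, 0.01, 0.0]

def tristimulus(cmf):
    """Integrate surface x illuminant against one color-matching function."""
    return sum(e * r * c for e, r, c in zip(illuminant, reflectance, cmf))

x, y, z = tristimulus(xbar), tristimulus(ybar), tristimulus(zbar)
```

Change the illuminant samples and the resulting XYZ shifts even though the surface is identical, which is precisely why the same paint looks different under different lights.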
Real Pigment Physics
When you mix red and blue paint, you don't get the same result as mixing red and blue light. That's because paint mixing is spectral (subtractive wavelength interaction), not additive RGB math. Spectral models are the frontier for realistic digital paint simulation and accurate material rendering.
CAM Models (CIECAM02 / CAM16)
Color Appearance Models — how viewing conditions change what you see
Color Appearance Models go beyond colorimetry to model how viewing conditions affect perceived color. CIECAM02 and its successor CAM16 account for surround brightness, illumination, and chromatic adaptation — the same color chip looks different under warm tungsten vs. cool daylight.
Key Attributes
- J — Lightness (adaptation-corrected)
- C — Chroma (colorfulness relative to white)
- h — Hue angle
- Q — Brightness (absolute, not relative)
- M — Colorfulness (absolute chroma)
- s — Saturation (chroma relative to brightness)
CAM models answer questions like: “How would this palette look in a dimly-lit room?” or “Will these colors still work under fluorescent office lighting?” This is the most advanced frontier of applied color science.
Why the Same Color Looks Different
Your brain continuously adapts to lighting conditions — a white sheet of paper looks white under both warm and cool lights, even though the actual wavelengths hitting your eye are very different. CAM models simulate this adaptation, enabling tools to predict how users will perceive a palette in real-world environments.
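The adaptation step at the core of CIECAM02 can be sketched as a von Kries-style scaling in the CAT02 cone space: measure both white points in cone coordinates, rescale each cone response, and return to XYZ. This is a simplified sketch (full CIECAM02/CAM16 adds degree-of-adaptation and surround terms), and the function name is illustrative:

```python
def adapt_white(xyz, src_white, dst_white):
    """Von Kries-style chromatic adaptation using the CAT02 cone matrix."""
    M = [[0.7328, 0.4296, -0.1624],
         [-0.7036, 1.6975, 0.0061],
         [0.0030, 0.0136, 0.9834]]

    def mul(mat, v):
        return [sum(mat[i][j] * v[j] for j in range(3)) for i in range(3)]

    def inv3(m):
        # Cofactor inverse of a 3x3 matrix
        a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
        det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
        return [[(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
                [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
                [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det]]

    lms, ws, wd = mul(M, xyz), mul(M, src_white), mul(M, dst_white)
    # Scale each cone response by the destination/source white-point ratio
    adapted = [lms[i] * wd[i] / ws[i] for i in range(3)]
    return mul(inv3(M), adapted)
```

Adapting the D65 white point toward a tungsten (Illuminant A) white maps it exactly onto the tungsten white, mirroring how your visual system re-normalizes "white" under warm light.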