Struct scarlet::color::XYZColor

pub struct XYZColor {
    pub x: f64,
    pub y: f64,
    pub z: f64,
    pub illuminant: Illuminant,
}

A point in the CIE 1931 XYZ color space. Although any point in XYZ coordinate space is technically valid, in this library XYZ colors are treated as normalized so that Y=1 is the white point of whatever illuminant is being worked with.
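Concretely, this normalization amounts to dividing a raw XYZ measurement by the luminance of the illuminant's white point, so that pure white lands at Y = 1. A minimal standalone sketch (plain Rust, not Scarlet's API):

```rust
// Scale a raw XYZ measurement so that the reference white has Y = 1.
// `white_y` is the unnormalized luminance of the illuminant's white point.
fn normalize(x: f64, y: f64, z: f64, white_y: f64) -> (f64, f64, f64) {
    (x / white_y, y / white_y, z / white_y)
}

fn main() {
    // A D65 white measured on the common 0..100 luminance scale maps to Y = 1.
    let (x, y, z) = normalize(95.047, 100.0, 108.883, 100.0);
    println!("{:.5} {:.5} {:.5}", x, y, z); // 0.95047 1.00000 1.08883
}
```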

Fields

x: f64

The X axis of the CIE 1931 XYZ space, roughly representing the long-wavelength (red) receptors in the human eye. Usually between 0 and 1, but values outside that range are valid.

y: f64

The Y axis of the CIE 1931 XYZ space, roughly representing the middle-wavelength receptors in the human eye. In CIE 1931, this axis is fudged slightly so that it corresponds exactly with perceived luminance; it therefore no longer maps precisely to the middle-wavelength receptors, but it is a far more useful quantity.

z: f64

The Z axis of the CIE 1931 XYZ space, roughly representing the short-wavelength (blue) receptors in the human eye. Usually between 0 and 1, but values outside that range are valid.

illuminant: Illuminant

The illuminant assumed to be the lighting environment for this color. Although XYZ itself describes the human response to a color, and so is independent of lighting, it is useful to ask "how would an object with this color look under a different light?" The illuminant is stored so that the color carries all the information needed to answer that question. See the color_adapt() method for how this is used in practice.

Methods

impl XYZColor

pub fn color_adapt(&self, other_illuminant: Illuminant) -> XYZColor

Converts a color from one illuminant to another, such that a human observer receiving both sets of stimuli under the corresponding lighting conditions would perceive the underlying object color as unchanged. This process, called chromatic adaptation, happens subconsciously all the time: when someone walks into the shade, we don't interpret the shift as their face turning blue. It is not at all simple to compute, however, and many different algorithms for it exist; each person most likely has their own idiosyncrasies in chromatic adaptation, so there is no perfect solution. Scarlet implements the Bradford transform, generally acknowledged as one of the leading chromatic adaptation transforms. For exact color science work, other models are more appropriate, such as CIECAM02 if you can measure the viewing conditions exactly. This transform may not give very good results with custom illuminants that differ wildly, but with the standard illuminants it does a very good job.
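To make the idea concrete, here is a standalone sketch of the Bradford transform, using the standard published cone-response matrices (this illustrates the model, and is not Scarlet's internal code): the color and both white points are mapped into a cone-response space, each channel is scaled by the ratio of the white points, and the result is mapped back to XYZ.

```rust
// Standard Bradford cone-response matrix and its inverse (Lindbloom's values).
const M: [[f64; 3]; 3] = [
    [0.8951, 0.2664, -0.1614],
    [-0.7502, 1.7135, 0.0367],
    [0.0389, -0.0685, 1.0296],
];
const M_INV: [[f64; 3]; 3] = [
    [0.9869929, -0.1470543, 0.1599627],
    [0.4323053, 0.5183603, 0.0492912],
    [-0.0085287, 0.0400428, 0.9684867],
];

fn mul(m: &[[f64; 3]; 3], v: [f64; 3]) -> [f64; 3] {
    [
        m[0][0] * v[0] + m[0][1] * v[1] + m[0][2] * v[2],
        m[1][0] * v[0] + m[1][1] * v[1] + m[1][2] * v[2],
        m[2][0] * v[0] + m[2][1] * v[1] + m[2][2] * v[2],
    ]
}

// Adapt `xyz` from the illuminant with white point `src_white` to `dst_white`.
fn bradford_adapt(xyz: [f64; 3], src_white: [f64; 3], dst_white: [f64; 3]) -> [f64; 3] {
    let s = mul(&M, src_white); // source white in cone-response space
    let d = mul(&M, dst_white); // destination white in cone-response space
    let c = mul(&M, xyz);       // the color in cone-response space
    // Scale each cone channel by the ratio of the white points, then map back.
    mul(&M_INV, [c[0] * d[0] / s[0], c[1] * d[1] / s[1], c[2] * d[2] / s[2]])
}

fn main() {
    let d65 = [0.95047, 1.0, 1.08883];
    let d50 = [0.96422, 1.0, 0.82521];
    // Sanity check: adapting the D65 white point to D50 recovers the D50 white point.
    let w = bradford_adapt(d65, d65, d50);
    println!("{:.5} {:.5} {:.5}", w[0], w[1], w[2]); // ~0.96422 1.00000 0.82521
}
```

By construction, a source white point always adapts exactly to the destination white point, which is the sanity check above.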

Example: The Fabled Dress

The most accessible way of describing chromatic adaptation is to take a look at this image, otherwise known as "the dress". It showcases, in a very apparent fashion, the problem that ambiguous lighting poses for chromatic adaptation: the photo is cropped such that part of the population perceives it as being in deep shade, and therefore sees the dress as white and gold, while the rest perceive harsh sunlight and see it as black and blue. (For reference, it is actually black and blue.) Scarlet can help us answer the question "how would this look to an observer with either judgment about the lighting conditions?" without needing the human eye. First, we use a photo editor to pick out two colors that represent the two colors of the dress. Then we change the illuminant directly (without using chromatic adaptation, because we want to actually change the color), and finally we adapt back to D65 to represent the resulting colors on a screen.

use scarlet::color::{Color, RGBColor};
use scarlet::illuminants::Illuminant;

let dress_bg = RGBColor::from_hex_code("#7d6e47").unwrap().to_xyz(Illuminant::D65);
let dress_fg = RGBColor::from_hex_code("#9aabd6").unwrap().to_xyz(Illuminant::D65);
// proposed sunlight illuminant: daylight in North America
// We could exaggerate the effect by creating an illuminant with greater Y value at the white
// point, but this will do
let sunlight = Illuminant::D50;
// proposed "shade" illuminant: created by picking the brightest point on the dress without
// glare subjectively, and then treating that as white
let shade_white = RGBColor::from_hex_code("#b0c5e4").unwrap().to_xyz(Illuminant::D65);
let shade = Illuminant::Custom([shade_white.x, shade_white.y, shade_white.z]);
// make copies of the colors and set illuminants
let mut black = dress_bg;
let mut blue = dress_fg;
let mut gold = dress_bg;
let mut white = dress_fg;
black.illuminant = sunlight;
blue.illuminant = sunlight;
gold.illuminant = shade;
white.illuminant = shade;
// we can just print them out now: the chromatic adaptation is done automatically to get back
// to the color space of the viewing monitor. This isn't exact, mostly because the shade
// illuminant is entirely fudged, but it's surprisingly good
let black_rgb: RGBColor = black.convert();
let blue_rgb: RGBColor = blue.convert();
let gold_rgb: RGBColor = gold.convert();
let white_rgb: RGBColor = white.convert();
println!("Black: {} Blue: {}", black_rgb.to_string(), blue_rgb.to_string());
println!("Gold: {}, White: {}", gold_rgb.to_string(), white_rgb.to_string());

pub fn approx_equal(&self, other: &XYZColor) -> bool

Returns true if every coordinate of the given other XYZ color is within an acceptable margin of error of the corresponding coordinate of this color. This accounts for the unavoidable floating-point error in conversions. To test whether two colors are indistinguishable to humans, use Color::visually_indistinguishable instead.

Example

let xyz1 = XYZColor{x: 0.3, y: 0., z: 0., illuminant: Illuminant::D65};
// note that the difference in illuminant won't be taken into account
let xyz2 = XYZColor{x: 0.1 + 0.1 + 0.1, y: 0., z: 0., illuminant: Illuminant::D55};
// note that because of rounding error these aren't exactly equal!
assert!(xyz1.x != xyz2.x);
// using approx_equal, we can avoid these sorts of errors
assert!(xyz1.approx_equal(&xyz2));

pub fn approx_visually_equal(&self, other: &XYZColor) -> bool

Returns true if the given other XYZ color would look identical to this one when viewed, even if the two have different illuminants. Uses an approximate float equality that resolves errors due to floating-point representation, testing only whether each pair of coordinates is within 0.001 of each other.

Example

assert!(XYZColor::white_point(Illuminant::D65).approx_visually_equal(&XYZColor::white_point(Illuminant::D50)));

pub fn white_point(illuminant: Illuminant) -> XYZColor

Gets the XYZColor corresponding to pure white in the given light environment.

Example

let white1 = XYZColor::white_point(Illuminant::D65);
let white2 = XYZColor::white_point(Illuminant::D50);
assert!(white1.approx_visually_equal(&white2));

Trait Implementations

impl Clone for XYZColor

impl Color for XYZColor

impl Copy for XYZColor

impl Debug for XYZColor

impl PartialEq<XYZColor> for XYZColor

impl StructuralPartialEq for XYZColor

Auto Trait Implementations

Blanket Implementations

impl<T> Any for T where T: 'static + ?Sized

impl<T> Borrow<T> for T where T: ?Sized

impl<T> BorrowMut<T> for T where T: ?Sized

impl<T> From<T> for T

impl<T, U> Into<U> for T where U: From<T>

impl<T> ToOwned for T where T: Clone

type Owned = T

The resulting type after obtaining ownership.

impl<T, U> TryFrom<U> for T where U: Into<T>

type Error = Infallible

The type returned in the event of a conversion error.

impl<T, U> TryInto<U> for T where U: TryFrom<T>

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.