Trait BufferBlend
pub trait BufferBlend<P, Container>
where
    P: Pixel,
    Container: Deref<Target = [P::Subpixel]> + AsRef<[P::Subpixel]>,
{
    // Required method
    fn blend(
        &mut self,
        other: &ImageBuffer<P, Container>,
        op: fn(f64, f64) -> f64,
        apply_to_color: bool,
        apply_to_alpha: bool,
    ) -> Result<(), Error>;
}

Required Methods

fn blend(
    &mut self,
    other: &ImageBuffer<P, Container>,
    op: fn(f64, f64) -> f64,
    apply_to_color: bool,
    apply_to_alpha: bool,
) -> Result<(), Error>

Blend other into self using the function op; the first argument passed to op comes from self and the second from other.

Handles type conversion and alpha channel detection and placement automatically.

You may blend a luma image into an rgb image (in which case the luma image will be treated as a grayscale rgb image), but you cannot blend an rgba image into a luma image.

If other has an alpha channel, the blend is weighted by that alpha channel: if other’s alpha at a pixel is 0.5, the blend effect is half strength.
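A minimal sketch of this weighting, assuming it is a plain linear interpolation on normalized values (illustrative only, not the crate’s actual code; the name weight_by_alpha is hypothetical):

```rust
// Hypothetical illustration of alpha weighting on values in 0.0..1.0.
fn weight_by_alpha(original: f64, blended: f64, other_alpha: f64) -> f64 {
    // Interpolate from `self`'s original value toward the blended value
    // by `other`'s alpha: 0.0 leaves `self` untouched, 1.0 applies the
    // blend at full strength.
    original + (blended - original) * other_alpha
}

fn main() {
    // With alpha 0.5 the result lands halfway between original and blended.
    println!("{}", weight_by_alpha(0.25, 0.75, 0.5)); // 0.5
}
```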

Arguments

Use apply_to_color and apply_to_alpha to control which channels are affected.

If apply_to_alpha is true but self or other does not have an alpha channel, this option has no effect.

op is a function that takes two f64 values and returns an f64 (e.g. |a, b| a + b, where a comes from self and b from other).

Standard blend modes such as those found in Photoshop are provided as functions (e.g. pixel_add, pixel_mult, etc.).
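Any non-capturing function with this shape works. The sketches below show what such blend modes typically look like on normalized values; the names add, multiply, and screen are illustrative, not the crate’s actual implementations:

```rust
// Signature-compatible blend-mode sketches on values in 0.0..1.0.
fn add(a: f64, b: f64) -> f64 { a + b }          // lighten by summing
fn multiply(a: f64, b: f64) -> f64 { a * b }     // darken by multiplying
fn screen(a: f64, b: f64) -> f64 { 1.0 - (1.0 - a) * (1.0 - b) } // inverted multiply

fn main() {
    println!("{}", multiply(0.5, 0.5)); // 0.25
    println!("{}", screen(0.5, 0.5));   // 0.75
}
```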

The values are normalized to the range 0.0..1.0 before blending, then scaled back to the input type’s range.

The output from op is automatically clamped to 0.0..1.0 before being converted back to the input type, so you don’t need to worry about overflow or underflow.
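The normalize → blend → clamp → rescale pipeline can be sketched per subpixel like this (an assumed illustration for u8 subpixels, not the crate’s source; blend_subpixel is a hypothetical helper):

```rust
// Hypothetical per-subpixel pipeline: normalize to 0.0..1.0, apply `op`,
// clamp, then scale back to the subpixel type's range.
fn blend_subpixel(a: u8, b: u8, op: fn(f64, f64) -> f64) -> u8 {
    let max = u8::MAX as f64;
    let (fa, fb) = (a as f64 / max, b as f64 / max); // normalize
    let out = op(fa, fb).clamp(0.0, 1.0);            // blend + clamp
    (out * max).round() as u8                        // scale back
}

fn main() {
    let add = |a: f64, b: f64| a + b;
    // Additive blend would overflow u8, but the clamp keeps it in range.
    println!("{}", blend_subpixel(200, 200, add)); // 255
    println!("{}", blend_subpixel(100, 50, add));  // 150
}
```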

Errors

DimensionMismatch: self and other have different dimensions

UnsupportedBlend: self is a luma image and other is an rgb image

Examples
Example 1:

Using the pixel_mult function to blend two images together:

use image::open;
use image_blend::BufferBlend;
use image_blend::pixelops::pixel_mult;

// Load an image
let mut img1_dynamic = open("test_data/1.png").unwrap();
let img1_buffer = img1_dynamic.as_mut_rgba8().unwrap();

// Load another image
let img2_dynamic = open("test_data/2.png").unwrap();
let img2_buffer = img2_dynamic.to_rgba16();

// Blend the images using the pixel_mult function
img1_buffer.blend(&img2_buffer, pixel_mult, true, false).unwrap();
img1_buffer.save("tests_out/doctest_buffer_blend_result.png").unwrap();
Example 2:

Using a custom function to blend two images together:

use image::open;
use image_blend::BufferBlend;

let closest_to_gray = |a: f64, b: f64| {
    let a_diff = (a - 0.5).abs();
    let b_diff = (b - 0.5).abs();
    if a_diff < b_diff {
        a
    } else {
        b
    }
};

// Load an image
let mut img1_dynamic = open("test_data/1.png").unwrap();
let img1_buffer = img1_dynamic.as_mut_rgba8().unwrap();

// Load another image
let img2_dynamic = open("test_data/2.png").unwrap();
let img2_buffer = img2_dynamic.to_rgba16();

// Blend the images using our custom function
img1_buffer.blend(&img2_buffer, closest_to_gray, true, false).unwrap();
img1_buffer.save("tests_out/doctest_buffer_custom_result.png").unwrap();

Implementations on Foreign Types

impl<P, Pmut, Container, ContainerMut> BufferBlend<P, Container> for ImageBuffer<Pmut, ContainerMut>
where
    Pmut: Pixel,
    P: Pixel,
    Container: Deref<Target = [P::Subpixel]> + AsRef<[<P as Pixel>::Subpixel]>,
    ContainerMut: DerefMut<Target = [Pmut::Subpixel]> + AsMut<[<Pmut as Pixel>::Subpixel]>,


fn blend(
    &mut self,
    other: &ImageBuffer<P, Container>,
    op: fn(f64, f64) -> f64,
    apply_to_color: bool,
    apply_to_alpha: bool,
) -> Result<(), Error>

Implementors