Enum FilterStep
pub enum FilterStep {
    Trim { start: f64, end: f64 },
    Scale { width: u32, height: u32, algorithm: ScaleAlgorithm },
    Crop { x: u32, y: u32, width: u32, height: u32 },
    Overlay { x: i32, y: i32 },
    FadeIn { start: f64, duration: f64 },
    FadeOut { start: f64, duration: f64 },
    AFadeIn { start: f64, duration: f64 },
    AFadeOut { start: f64, duration: f64 },
    FadeInWhite { start: f64, duration: f64 },
    FadeOutWhite { start: f64, duration: f64 },
    Rotate { angle_degrees: f64, fill_color: String },
    ToneMap(ToneMap),
    Volume(f64),
    Amix(usize),
    ParametricEq { bands: Vec<EqBand> },
    Lut3d { path: String },
    Eq { brightness: f32, contrast: f32, saturation: f32 },
    EqAnimated { brightness: AnimatedValue<f64>, contrast: AnimatedValue<f64>, saturation: AnimatedValue<f64>, gamma: AnimatedValue<f64> },
    ColorBalanceAnimated { lift: AnimatedValue<(f64, f64, f64)>, gamma: AnimatedValue<(f64, f64, f64)>, gain: AnimatedValue<(f64, f64, f64)> },
    Curves { master: Vec<(f32, f32)>, r: Vec<(f32, f32)>, g: Vec<(f32, f32)>, b: Vec<(f32, f32)> },
    WhiteBalance { temperature_k: u32, tint: f32 },
    Hue { degrees: f32 },
    Gamma { r: f32, g: f32, b: f32 },
    ThreeWayCC { lift: Rgb, gamma: Rgb, gain: Rgb },
    Vignette { angle: f32, x0: f32, y0: f32 },
    HFlip,
    VFlip,
    Reverse,
    AReverse,
    Pad { width: u32, height: u32, x: i32, y: i32, color: String },
    FitToAspect { width: u32, height: u32, color: String },
    GBlur { sigma: f32 },
    CropAnimated { x: AnimatedValue<f64>, y: AnimatedValue<f64>, width: AnimatedValue<f64>, height: AnimatedValue<f64> },
    GBlurAnimated { sigma: AnimatedValue<f64> },
    Unsharp { luma_strength: f32, chroma_strength: f32 },
    Hqdn3d { luma_spatial: f32, chroma_spatial: f32, luma_tmp: f32, chroma_tmp: f32 },
    Nlmeans { strength: f32 },
    Yadif { mode: YadifMode },
    XFade { transition: XfadeTransition, duration: f64, offset: f64 },
    DrawText { opts: DrawTextOptions },
    SubtitlesSrt { path: String },
    SubtitlesAss { path: String },
    Speed { factor: f64 },
    LoudnessNormalize { target_lufs: f32, true_peak_db: f32, lra: f32 },
    NormalizePeak { target_db: f32 },
    ANoiseGate { threshold_db: f32, attack_ms: f32, release_ms: f32 },
    ACompressor { threshold_db: f32, ratio: f32, attack_ms: f32, release_ms: f32, makeup_db: f32 },
    StereoToMono,
    ChannelMap { mapping: String },
    AudioDelay { ms: f64 },
    ConcatVideo { n: u32 },
    ConcatAudio { n: u32 },
    FreezeFrame { pts: f64, duration: f64 },
    Ticker { text: String, y: String, speed_px_per_sec: f32, font_size: u32, font_color: String },
    JoinWithDissolve { clip_a_end: f64, clip_b_start: f64, dissolve_dur: f64 },
    OverlayImage { path: String, x: String, y: String, opacity: f32 },
    Blend { top: Box<FilterGraphBuilder>, mode: BlendMode, opacity: f32 },
    ChromaKey { color: String, similarity: f32, blend: f32 },
    ColorKey { color: String, similarity: f32, blend: f32 },
    SpillSuppress { key_color: String, strength: f32 },
    AlphaMatte { matte: Box<FilterGraphBuilder> },
    LumaKey { threshold: f32, tolerance: f32, softness: f32, invert: bool },
    RectMask { x: u32, y: u32, width: u32, height: u32, invert: bool },
    FeatherMask { radius: u32 },
    PolygonMatte { vertices: Vec<(f32, f32)>, invert: bool },
}

A single step in a filter chain.

Used by crate::FilterGraphBuilder to build pipeline filter graphs, and by crate::AudioTrack::effects to attach per-track effects in a multi-track mix.

Variants§

§

Trim

Keep only frames in [start, end) seconds.

Fields

§start: f64
§end: f64
§

Scale

Scale to a new resolution using the given resampling algorithm.

Fields

§width: u32
§height: u32
§algorithm: ScaleAlgorithm
§

Crop

Crop a rectangular region.

Fields

§x: u32
§y: u32
§width: u32
§height: u32
§

Overlay

Overlay a second stream at position (x, y).

Fields

§x: i32
§y: i32
§

FadeIn

Fade-in from black starting at start seconds, over duration seconds.

Fields

§start: f64
§duration: f64
§

FadeOut

Fade-out to black starting at start seconds, over duration seconds.

Fields

§start: f64
§duration: f64
§

AFadeIn

Audio fade-in from silence starting at start seconds, over duration seconds.

Fields

§start: f64
§duration: f64
§

AFadeOut

Audio fade-out to silence starting at start seconds, over duration seconds.

Fields

§start: f64
§duration: f64
§

FadeInWhite

Fade-in from white starting at start seconds, over duration seconds.

Fields

§start: f64
§duration: f64
§

FadeOutWhite

Fade-out to white starting at start seconds, over duration seconds.

Fields

§start: f64
§duration: f64
§

Rotate

Rotate clockwise by angle_degrees, filling exposed areas with fill_color.

Fields

§angle_degrees: f64
§fill_color: String
§

ToneMap(ToneMap)

HDR-to-SDR tone mapping.

§

Volume(f64)

Adjust audio volume (in dB; negative = quieter).

§

Amix(usize)

Mix n audio inputs together.

§

ParametricEq

Multi-band parametric equalizer (low-shelf, high-shelf, or peak bands).

Each band maps to its own FFmpeg filter node chained in sequence. The bands vec must not be empty.

Fields

§bands: Vec<EqBand>
§

Lut3d

Apply a 3D LUT from a .cube or .3dl file.

Fields

§path: String
§

Eq

Brightness/contrast/saturation adjustment via FFmpeg eq filter.

Fields

§brightness: f32
§contrast: f32
§saturation: f32
§

EqAnimated

Brightness / contrast / saturation / gamma via FFmpeg eq filter (optionally animated).

Arguments are evaluated at Duration::ZERO for the initial graph build. Per-frame updates are applied via avfilter_graph_send_command in #363.

Fields

§brightness: AnimatedValue<f64>

Brightness offset. Range: −1.0 – 1.0 (neutral: 0.0).

§contrast: AnimatedValue<f64>

Contrast multiplier. Range: 0.0 – 3.0 (neutral: 1.0).

§saturation: AnimatedValue<f64>

Saturation multiplier. Range: 0.0 – 3.0 (neutral: 1.0; 0.0 = grayscale).

§gamma: AnimatedValue<f64>

Global gamma correction. Range: 0.1 – 10.0 (neutral: 1.0).

§

ColorBalanceAnimated

Three-way color balance (shadows / midtones / highlights) via FFmpeg colorbalance filter (optionally animated).

Each tuple is (R, G, B). Valid range per component: −1.0 – 1.0 (neutral: 0.0).

Arguments are evaluated at Duration::ZERO for the initial graph build. Per-frame updates are applied via avfilter_graph_send_command in #363.

Fields

§lift: AnimatedValue<(f64, f64, f64)>

Shadows (lift) correction per channel. FFmpeg params: "rs", "gs", "bs".

§gamma: AnimatedValue<(f64, f64, f64)>

Midtones (gamma) correction per channel. FFmpeg params: "rm", "gm", "bm".

§gain: AnimatedValue<(f64, f64, f64)>

Highlights (gain) correction per channel. FFmpeg params: "rh", "gh", "bh".

§

Curves

Per-channel RGB color curves adjustment.

Fields

§master: Vec<(f32, f32)>
§r: Vec<(f32, f32)>
§g: Vec<(f32, f32)>
§b: Vec<(f32, f32)>
§

WhiteBalance

White balance correction via colorchannelmixer.

Fields

§temperature_k: u32
§tint: f32
§

Hue

Hue rotation by an arbitrary angle.

Fields

§degrees: f32
§

Gamma

Per-channel gamma correction via FFmpeg eq filter.

Fields

§r: f32
§g: f32
§b: f32
§

ThreeWayCC

Three-way colour corrector (lift / gamma / gain) via FFmpeg curves filter.

Fields

§lift: Rgb

Affects shadows (blacks). Neutral: Rgb::NEUTRAL.

§gamma: Rgb

Affects midtones. Neutral: Rgb::NEUTRAL. All components must be > 0.0.

§gain: Rgb

Affects highlights (whites). Neutral: Rgb::NEUTRAL.

§

Vignette

Vignette effect via FFmpeg vignette filter.

Fields

§angle: f32

Radius angle in radians (valid range: 0.0 – π/2 ≈ 1.5708). Default: π/5 ≈ 0.628.

§x0: f32

Horizontal centre of the vignette. 0.0 maps to w/2.

§y0: f32

Vertical centre of the vignette. 0.0 maps to h/2.

§

HFlip

Horizontal flip (mirror left-right).

§

VFlip

Vertical flip (mirror top-bottom).

§

Reverse

Reverse video playback (buffers entire clip in memory — use only on short clips).

§

AReverse

Reverse audio playback (buffers entire clip in memory — use only on short clips).

§

Pad

Pad to a target resolution with a fill color (letterbox / pillarbox).

Fields

§width: u32

Target canvas width in pixels.

§height: u32

Target canvas height in pixels.

§x: i32

Horizontal offset of the source frame within the canvas. Negative values are replaced with (ow-iw)/2 (centred).

§y: i32

Vertical offset of the source frame within the canvas. Negative values are replaced with (oh-ih)/2 (centred).

§color: String

Fill color (any FFmpeg color string, e.g. "black", "0x000000").

§

FitToAspect

Scale (preserving aspect ratio) then centre-pad to fill target dimensions (letterbox or pillarbox as required).

Implemented as a scale filter with force_original_aspect_ratio=decrease followed by a pad filter that centres the scaled frame on the canvas.

Fields

§width: u32

Target canvas width in pixels.

§height: u32

Target canvas height in pixels.

§color: String

Fill color for the bars (any FFmpeg color string, e.g. "black").
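The scale-then-pad arithmetic can be sketched as follows (fit_to_aspect is a hypothetical helper for illustration, not part of the crate's API):

```rust
/// Scale to fit inside the target while preserving aspect ratio
/// (force_original_aspect_ratio=decrease), then centre on the canvas.
/// Returns (x offset, y offset, scaled width, scaled height).
fn fit_to_aspect(src_w: u32, src_h: u32, dst_w: u32, dst_h: u32) -> (u32, u32, u32, u32) {
    // Shrink by the limiting dimension so the frame fits entirely inside.
    let scale = f64::min(dst_w as f64 / src_w as f64, dst_h as f64 / src_h as f64);
    let w = (src_w as f64 * scale).round() as u32;
    let h = (src_h as f64 * scale).round() as u32;
    // pad centres the scaled frame: x = (ow - iw) / 2, y = (oh - ih) / 2.
    ((dst_w - w) / 2, (dst_h - h) / 2, w, h)
}

fn main() {
    // A 4:3 source in a 16:9 canvas gets pillarbox bars left and right.
    let (x, y, w, h) = fit_to_aspect(640, 480, 1920, 1080);
    println!("scaled {w}x{h}, offset ({x}, {y})");
}
```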

§

GBlur

Gaussian blur with configurable strength.

sigma is the Gaussian standard deviation. Valid range: 0.0 – 10.0 (values near 0.0 are nearly a no-op; higher values produce a stronger blur).

Fields

§sigma: f32

Blur radius (standard deviation). Must be ≥ 0.0.

§

CropAnimated

Crop with optionally animated boundaries (pixels, f64 for sub-pixel precision).

Arguments are evaluated at Duration::ZERO for the initial graph build. Per-frame updates are applied via avfilter_graph_send_command in #363.

Fields

§x: AnimatedValue<f64>

X offset of the top-left corner, in pixels.

§y: AnimatedValue<f64>

Y offset of the top-left corner, in pixels.

§width: AnimatedValue<f64>

Width of the cropped region. Must evaluate to > 0 at Duration::ZERO.

§height: AnimatedValue<f64>

Height of the cropped region. Must evaluate to > 0 at Duration::ZERO.

§

GBlurAnimated

Gaussian blur with an optionally animated sigma (blur radius).

Arguments are evaluated at Duration::ZERO for the initial graph build. Per-frame updates are applied via avfilter_graph_send_command in #363.

Fields

§sigma: AnimatedValue<f64>

Blur radius (standard deviation). Must evaluate to ≥ 0.0 at Duration::ZERO.

§

Unsharp

Sharpen or blur via unsharp mask (luma + chroma strength).

Positive values sharpen; negative values blur. Valid range for each component: −1.5 – 1.5.

Fields

§luma_strength: f32

Luma (brightness) sharpening/blurring amount. Range: −1.5 – 1.5.

§chroma_strength: f32

Chroma (colour) sharpening/blurring amount. Range: −1.5 – 1.5.

§

Hqdn3d

High Quality 3D noise reduction (hqdn3d).

Typical values: luma_spatial=4.0, chroma_spatial=3.0, luma_tmp=6.0, chroma_tmp=4.5. All values must be ≥ 0.0.

Fields

§luma_spatial: f32

Spatial luma noise reduction strength. Must be ≥ 0.0.

§chroma_spatial: f32

Spatial chroma noise reduction strength. Must be ≥ 0.0.

§luma_tmp: f32

Temporal luma noise reduction strength. Must be ≥ 0.0.

§chroma_tmp: f32

Temporal chroma noise reduction strength. Must be ≥ 0.0.

§

Nlmeans

Non-local means noise reduction (nlmeans).

strength controls the denoising intensity; range 1.0–30.0. Higher values remove more noise but are significantly more CPU-intensive.

NOTE: nlmeans is CPU-intensive; avoid for real-time pipelines.

Fields

§strength: f32

Denoising strength. Must be in the range [1.0, 30.0].

§

Yadif

Deinterlace using the yadif filter.

Fields

§mode: YadifMode

Deinterlacing mode controlling output frame rate and spatial checks.

§

XFade

Cross-dissolve transition between two video streams (xfade).

Requires two input slots: slot 0 is clip A, slot 1 is clip B. duration is the overlap length in seconds; offset is the PTS offset (in seconds) at which clip B begins.

Fields

§transition: XfadeTransition

Transition style.

§duration: f64

Overlap duration in seconds. Must be > 0.0.

§offset: f64

PTS offset (seconds) where clip B starts.

§

DrawText

Draw text onto the video using the drawtext filter.

Fields

§opts: DrawTextOptions

Full set of drawtext parameters.

§

SubtitlesSrt

Burn-in SRT subtitles (hard subtitles) using the subtitles filter.

Fields

§path: String

Absolute or relative path to the .srt file.

§

SubtitlesAss

Burn-in ASS/SSA styled subtitles using the ass filter.

Fields

§path: String

Absolute or relative path to the .ass or .ssa file.

§

Speed

Playback speed change using setpts (video) and chained atempo (audio).

factor > 1.0 = fast motion; factor < 1.0 = slow motion. Valid range: 0.1–100.0.

Video path: setpts=PTS/{factor}. Audio path: a single atempo instance only accepts factors in [0.5, 2.0], so filter_inner chains multiple instances to cover the full range.

Fields

§factor: f64

Speed multiplier. Must be in [0.1, 100.0].
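One way to chain atempo stages so their product equals the requested factor can be sketched as follows; this is illustrative and filter_inner may partition the factor differently:

```rust
/// Split a speed factor into atempo stages, each within [0.5, 2.0].
/// The product of the returned stages equals the input factor.
fn atempo_chain(mut factor: f64) -> Vec<f64> {
    assert!((0.1..=100.0).contains(&factor), "factor out of range");
    let mut stages = Vec::new();
    // Peel off maximal 2.0x stages until the remainder fits the range.
    while factor > 2.0 {
        stages.push(2.0);
        factor /= 2.0;
    }
    // Symmetrically, peel off minimal 0.5x stages for slow motion.
    while factor < 0.5 {
        stages.push(0.5);
        factor /= 0.5;
    }
    stages.push(factor);
    stages
}

fn main() {
    // 5x fast motion becomes three chained atempo instances.
    println!("{:?}", atempo_chain(5.0));
}
```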

§

LoudnessNormalize

EBU R128 two-pass loudness normalization.

Pass 1 measures integrated loudness with ebur128=peak=true:metadata=1. Pass 2 applies a linear volume correction so the output reaches target_lufs. All audio frames are buffered in memory between the two passes — use only for clips that fit comfortably in RAM.

Fields

§target_lufs: f32

Target integrated loudness in LUFS (e.g. −23.0). Must be < 0.0.

§true_peak_db: f32

True-peak ceiling in dBTP (e.g. −1.0). Must be ≤ 0.0.

§lra: f32

Target loudness range in LU (e.g. 7.0). Must be > 0.0.
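The pass-2 gain math can be sketched as a single linear correction that raises measured loudness to the target while respecting the true-peak ceiling. Whether the crate limits the gain against the ceiling exactly this way is an assumption; correction_db is a hypothetical helper:

```rust
/// Gain (in dB) to move a measured loudness to the target, limited so
/// the corrected true peak does not exceed the ceiling.
fn correction_db(measured_lufs: f64, measured_peak_db: f64,
                 target_lufs: f64, true_peak_db: f64) -> f64 {
    let gain = target_lufs - measured_lufs;         // e.g. -23 - (-30) = +7 dB
    let headroom = true_peak_db - measured_peak_db; // gain that would hit the ceiling
    gain.min(headroom)
}

fn main() {
    // A quiet clip (-30 LUFS, -9 dBTP) normalised to -23 LUFS / -1 dBTP.
    println!("{} dB", correction_db(-30.0, -9.0, -23.0, -1.0));
}
```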

§

NormalizePeak

Peak-level two-pass normalization using astats.

Pass 1 measures the true peak with astats=metadata=1. Pass 2 applies volume={gain}dB so the output peak reaches target_db. All audio frames are buffered in memory between passes — use only for clips that fit comfortably in RAM.

Fields

§target_db: f32

Target peak level in dBFS (e.g. −1.0). Must be ≤ 0.0.

§

ANoiseGate

Noise gate via FFmpeg’s agate filter.

Audio below threshold_db is attenuated; audio above passes through. The threshold is converted from dBFS to the linear scale expected by agate’s threshold parameter (linear = 10^(dB/20)).

Fields

§threshold_db: f32

Gate open/close threshold in dBFS (e.g. −40.0).

§attack_ms: f32

Attack time in milliseconds — how quickly the gate opens. Must be > 0.0.

§release_ms: f32

Release time in milliseconds — how quickly the gate closes. Must be > 0.0.
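The dBFS-to-linear conversion mentioned above is the standard formula linear = 10^(dB/20):

```rust
/// Convert a dBFS value to the linear amplitude scale expected by
/// agate's threshold parameter.
fn db_to_linear(db: f32) -> f32 {
    10f32.powf(db / 20.0)
}

fn main() {
    // -40 dBFS corresponds to 0.01 on the linear scale.
    println!("{}", db_to_linear(-40.0));
}
```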

§

ACompressor

Dynamic range compressor via FFmpeg’s acompressor filter.

Reduces the dynamic range of the audio signal: peaks above threshold_db are attenuated by ratio:1. makeup_db applies additional gain after compression to restore perceived loudness.

Fields

§threshold_db: f32

Compression threshold in dBFS (e.g. −20.0).

§ratio: f32

Compression ratio (e.g. 4.0 = 4:1). Must be ≥ 1.0.

§attack_ms: f32

Attack time in milliseconds. Must be > 0.0.

§release_ms: f32

Release time in milliseconds. Must be > 0.0.

§makeup_db: f32

Make-up gain in dB applied after compression (e.g. 6.0).

§

StereoToMono

Downmix stereo to mono via FFmpeg’s pan filter.

Both channels are mixed with equal weight: mono|c0=0.5*c0+0.5*c1. The output has a single channel.

§

ChannelMap

Remap audio channels using FFmpeg’s channelmap filter.

mapping is a |-separated list of output channel names taken from input channels, e.g. "FR|FL" swaps left and right. Must not be empty.

Fields

§mapping: String

FFmpeg channelmap mapping expression (e.g. "FR|FL").

§

AudioDelay

A/V sync correction via audio delay or advance.

Positive ms: uses FFmpeg’s adelay filter to shift audio later. Negative ms: uses FFmpeg’s atrim filter to trim the audio start, effectively advancing audio by |ms| milliseconds. Zero ms: uses adelay with zero delay (no-op).

Fields

§ms: f64

Delay in milliseconds. Positive = delay; negative = advance.
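The sign-based dispatch can be sketched with illustrative filter strings (the crate builds real graph nodes rather than strings, and the exact parameter spellings here are assumptions):

```rust
/// Choose the FFmpeg filter for an audio offset, by sign of `ms`.
fn audio_delay_filter(ms: f64) -> String {
    if ms > 0.0 {
        // Positive: shift audio later on all channels.
        format!("adelay=delays={}:all=1", ms as i64)
    } else if ms < 0.0 {
        // Negative: trim the start, advancing audio by |ms| milliseconds
        // (atrim's start is expressed in seconds).
        format!("atrim=start={}", -ms / 1000.0)
    } else {
        // Zero: adelay with zero delay, a no-op.
        "adelay=delays=0:all=1".to_string()
    }
}

fn main() {
    println!("{}", audio_delay_filter(250.0));
    println!("{}", audio_delay_filter(-250.0));
}
```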

§

ConcatVideo

Concatenate n sequential video input segments via FFmpeg’s concat filter.

Requires n video input slots (0 through n-1). n must be ≥ 2.

Fields

§n: u32

Number of video input segments to concatenate. Must be ≥ 2.

§

ConcatAudio

Concatenate n sequential audio input segments via FFmpeg’s concat filter.

Requires n audio input slots (0 through n-1). n must be ≥ 2.

Fields

§n: u32

Number of audio input segments to concatenate. Must be ≥ 2.

§

FreezeFrame

Freeze a single frame for a configurable duration using FFmpeg’s loop filter.

The frame nearest to pts seconds is held for duration seconds, then playback resumes. Frame numbers are approximated using a 25 fps assumption; accuracy depends on the source stream’s actual frame rate.

Fields

§pts: f64

Timestamp of the frame to freeze, in seconds. Must be ≥ 0.0.

§duration: f64

Duration to hold the frozen frame, in seconds. Must be > 0.0.
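The 25 fps approximation converts seconds to frame counts as sketched below (freeze_params is a hypothetical helper showing the arithmetic, not the crate's code):

```rust
/// Approximate loop-filter parameters from timestamps, assuming a fixed
/// frame rate. Returns (start frame, number of repeated frames).
fn freeze_params(pts: f64, duration: f64, fps: f64) -> (u64, u64) {
    let start_frame = (pts * fps).round() as u64;       // frame nearest to pts
    let extra_frames = (duration * fps).round() as u64; // frames to hold
    (start_frame, extra_frames)
}

fn main() {
    // Freeze the frame at 2.0 s for 1.5 s under the 25 fps assumption.
    println!("{:?}", freeze_params(2.0, 1.5, 25.0));
}
```

Because the real stream may not be 25 fps, the chosen frame can be off by the ratio of actual to assumed frame rate.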

§

Ticker

Scrolling text ticker (right-to-left) using the drawtext filter.

The text starts off-screen to the right and scrolls left at speed_px_per_sec pixels per second using the expression x = w - t * speed.

Fields

§text: String

Text to display. Special characters (\, :, ') are escaped.

§y: String

Y position as an FFmpeg expression, e.g. "h-50" or "10".

§speed_px_per_sec: f32

Horizontal scroll speed in pixels per second (must be > 0.0).

§font_size: u32

Font size in points.

§font_color: String

Font color as an FFmpeg color string, e.g. "white" or "0xFFFFFF".

§

JoinWithDissolve

Join two video clips with a cross-dissolve transition.

Compound step — expands in filter_inner to:

in0 → trim(end=clip_a_end+dissolve_dur) → setpts → xfade[0]
in1 → trim(start=max(0, clip_b_start−dissolve_dur)) → setpts → xfade[1]

Requires two video input slots: slot 0 = clip A, slot 1 = clip B. clip_a_end and dissolve_dur must be > 0.0.

Fields

§clip_a_end: f64

Timestamp (seconds) where clip A ends. Must be > 0.0.

§clip_b_start: f64

Timestamp (seconds) where clip B content starts (before the overlap).

§dissolve_dur: f64

Cross-dissolve overlap duration in seconds. Must be > 0.0.
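The trim arithmetic in the expansion above can be sketched as (dissolve_trims is a hypothetical helper mirroring the documented formulas):

```rust
/// Compute the trim points for the two inputs of a dissolve join.
/// Returns (clip A trim end, clip B trim start), both in seconds.
fn dissolve_trims(clip_a_end: f64, clip_b_start: f64, dissolve_dur: f64) -> (f64, f64) {
    // in0: trim(end = clip_a_end + dissolve_dur) keeps A through the overlap.
    let a_end = clip_a_end + dissolve_dur;
    // in1: trim(start = max(0, clip_b_start - dissolve_dur)) backs B up
    // into the overlap, clamped at the start of the clip.
    let b_start = (clip_b_start - dissolve_dur).max(0.0);
    (a_end, b_start)
}

fn main() {
    // A 1 s dissolve: A plays to 6 s, B is backed up to start at 9 s.
    println!("{:?}", dissolve_trims(5.0, 10.0, 1.0));
}
```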

§

OverlayImage

Composite a PNG image (watermark / logo) over video with optional opacity.

This is a compound step: internally it creates a movie source, a lut alpha-scaling filter, and an overlay compositing filter. The image file is loaded once at graph construction time.

Fields

§path: String

Absolute or relative path to the .png file.

§x: String

Horizontal position as an FFmpeg expression, e.g. "10" or "W-w-10".

§y: String

Vertical position as an FFmpeg expression, e.g. "10" or "H-h-10".

§opacity: f32

Opacity 0.0 (fully transparent) to 1.0 (fully opaque).

§

Blend

Blend a top layer over the current stream (bottom) using the given mode.

This is a compound step: the top builder’s steps are applied to the second input slot (in1), and opacity is clamped to [0.0, 1.0] by the builder method.

Box<FilterGraphBuilder> is used to break the otherwise-recursive type: FilterStep → FilterGraphBuilder → Vec<FilterStep>.

Fields

§top: Box<FilterGraphBuilder>

Filter pipeline for the top (foreground) layer.

§mode: BlendMode

How the two layers are combined.

§opacity: f32

Opacity of the top layer in [0.0, 1.0]; 1.0 = fully opaque.

§

ChromaKey

Remove pixels matching color using FFmpeg’s chromakey filter, producing a yuva420p output with transparent areas where the key color was detected.

Use this for YCbCr-encoded sources (most video). For RGB sources use colorkey instead.

Fields

§color: String

FFmpeg color string, e.g. "green", "0x00FF00", "#00FF00".

§similarity: f32

Match radius in [0.0, 1.0]; higher = more pixels removed.

§blend: f32

Edge softness in [0.0, 1.0]; 0.0 = hard edge.

§

ColorKey

Remove pixels matching color in RGB space using FFmpeg’s colorkey filter, producing an rgba output with transparent areas where the key color was detected.

Use this for RGB-encoded sources. For YCbCr-encoded video (most video) use chromakey instead.

Fields

§color: String

FFmpeg color string, e.g. "green", "0x00FF00", "#00FF00".

§similarity: f32

Match radius in [0.0, 1.0]; higher = more pixels removed.

§blend: f32

Edge softness in [0.0, 1.0]; 0.0 = hard edge.

§

SpillSuppress

Reduce color spill from the key color on subject edges using FFmpeg’s hue filter to desaturate the spill hue region.

Applies hue=h=0:s=(1.0 - strength). strength=0.0 leaves the image unchanged; strength=1.0 fully desaturates.

key_color is stored for future use by a more targeted per-hue implementation.

Fields

§key_color: String

FFmpeg color string identifying the spill color, e.g. "green".

§strength: f32

Suppression intensity in [0.0, 1.0]; 0.0 = no effect, 1.0 = full suppression.

§

AlphaMatte

Merge a grayscale matte as the alpha channel of the input video using FFmpeg’s alphamerge filter.

White (luma=255) in the matte produces fully opaque output; black (luma=0) produces fully transparent output.

This is a compound step: the matte builder’s pipeline is applied to the second input slot (in1) before the alphamerge filter is linked.

Box<FilterGraphBuilder> breaks the otherwise-recursive type, following the same pattern as FilterStep::Blend.

Fields

§matte: Box<FilterGraphBuilder>

Pipeline for the grayscale matte stream (slot 1).

§

LumaKey

Key out pixels by luminance value using FFmpeg’s lumakey filter.

Pixels whose normalized luma is within tolerance of threshold are made transparent. When invert is true, a geq filter is appended to negate the alpha channel, effectively swapping transparent and opaque regions.

  • threshold: luma cutoff in [0.0, 1.0]; 0.0 = black, 1.0 = white.
  • tolerance: match radius around the threshold in [0.0, 1.0].
  • softness: edge feather width in [0.0, 1.0]; 0.0 = hard edge.
  • invert: when false, keys out bright regions (pixels matching the threshold); when true, the alpha is negated after keying, making the complementary region transparent instead.

Output carries an alpha channel (yuva420p).

Fields

§threshold: f32

Luma cutoff in [0.0, 1.0].

§tolerance: f32

Match radius around the threshold in [0.0, 1.0].

§softness: f32

Edge feather width in [0.0, 1.0]; 0.0 = hard edge.

§invert: bool

When true, the alpha channel is negated after keying.
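The per-pixel rule can be sketched as below. This is illustrative only: the real work is done by FFmpeg's lumakey (and geq for the inversion), and lumakey's exact softness curve may differ from this linear ramp:

```rust
/// Alpha for one pixel: luma within `tolerance` of `threshold` becomes
/// transparent, with a linear `softness` ramp at the band edge; `invert`
/// negates the result. All values are normalised to [0.0, 1.0].
fn luma_alpha(luma: f32, threshold: f32, tolerance: f32, softness: f32, invert: bool) -> f32 {
    let dist = (luma - threshold).abs();
    let alpha = if dist <= tolerance {
        0.0 // inside the keyed band: transparent
    } else if dist <= tolerance + softness {
        (dist - tolerance) / softness // feathered edge
    } else {
        1.0 // far from the keyed band: opaque
    };
    if invert { 1.0 - alpha } else { alpha }
}

fn main() {
    // Key out near-white pixels (threshold 1.0, tolerance 0.1).
    println!("{}", luma_alpha(0.97, 1.0, 0.1, 0.05, false));
}
```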

§

RectMask

Apply a rectangular alpha mask using FFmpeg’s geq filter.

Pixels inside the rectangle defined by (x, y, width, height) are made fully opaque (alpha=255); pixels outside are made fully transparent (alpha=0). When invert is true the roles are swapped: inside becomes transparent and outside becomes opaque.

  • x, y: top-left corner of the rectangle (in pixels).
  • width, height: rectangle dimensions (must be > 0).
  • invert: when false, keeps the interior; when true, keeps the exterior.

width and height are validated in build; zero values return crate::FilterError::InvalidConfig.

The output carries an alpha channel (rgba).

Fields

§x: u32

Left edge of the rectangle (pixels from the left).

§y: u32

Top edge of the rectangle (pixels from the top).

§width: u32

Width of the rectangle in pixels (must be > 0).

§height: u32

Height of the rectangle in pixels (must be > 0).

§invert: bool

When true, the mask is inverted: outside is opaque, inside is transparent.
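The rectangle test that the geq expression encodes can be sketched per pixel as follows (rect_alpha is a hypothetical helper; whether the right and bottom edges are inclusive in the real expression is an assumption — a half-open rectangle is used here):

```rust
/// Alpha for one pixel of the rectangular mask: 255 inside the
/// half-open rectangle [x, x+w) x [y, y+h), 0 outside; `invert` swaps.
fn rect_alpha(px: u32, py: u32, x: u32, y: u32, w: u32, h: u32, invert: bool) -> u8 {
    let inside = px >= x && px < x + w && py >= y && py < y + h;
    // inside XOR invert selects which region stays opaque.
    if inside != invert { 255 } else { 0 }
}

fn main() {
    // A 100x100 mask at (10, 10): (50, 50) is inside, (5, 5) is not.
    println!("{} {}", rect_alpha(50, 50, 10, 10, 100, 100, false),
                      rect_alpha(5, 5, 10, 10, 100, 100, false));
}
```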

§

FeatherMask

Feather (soften) the alpha channel edges using a Gaussian blur.

Splits the stream into a color copy and an alpha copy, blurs the alpha plane with gblur=sigma=<radius>, then re-merges:

[in]split=2[color][with_alpha];
[with_alpha]alphaextract[alpha_only];
[alpha_only]gblur=sigma=<radius>[alpha_blurred];
[color][alpha_blurred]alphamerge[out]

radius is the blur kernel half-size in pixels and must be > 0. Validated in build; radius == 0 returns crate::FilterError::InvalidConfig.

Typically chained after a keying or masking step (e.g. FilterStep::ChromaKey, FilterStep::RectMask, FilterStep::PolygonMatte). Applying this step to a fully-opaque video (no prior alpha) is a no-op because a uniform alpha of 255 blurs to 255 everywhere.

Fields

§radius: u32

Gaussian blur kernel half-size in pixels (must be > 0).

§

PolygonMatte

Apply a polygon alpha mask using FFmpeg’s geq filter with a crossing-number point-in-polygon test.

Pixels inside the polygon are fully opaque (alpha=255); pixels outside are fully transparent (alpha=0). When invert is true the roles are swapped.

  • vertices: polygon corners as (x, y) in [0.0, 1.0] (normalised to frame size). Minimum 3, maximum 16.
  • invert: when false, inside = opaque; when true, outside = opaque.

Vertex count and coordinates are validated in build; out-of-range values return crate::FilterError::InvalidConfig.

The geq expression is constructed from the vertex list at graph build time. Degenerate polygons (zero area) produce a fully-transparent mask. The output carries an alpha channel (rgba).

Fields

§vertices: Vec<(f32, f32)>

Polygon corners in normalised [0.0, 1.0] frame coordinates.

§invert: bool

When true, the mask is inverted: outside is opaque, inside is transparent.
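The crossing-number test that the generated geq expression encodes can be sketched in plain Rust (point_in_polygon is a hypothetical helper for illustration):

```rust
/// Crossing-number point-in-polygon test in normalised coordinates.
/// A point is inside if a ray cast to the right crosses the polygon's
/// edges an odd number of times.
fn point_in_polygon(px: f32, py: f32, vertices: &[(f32, f32)]) -> bool {
    let n = vertices.len();
    let mut crossings = 0;
    for i in 0..n {
        let (x1, y1) = vertices[i];
        let (x2, y2) = vertices[(i + 1) % n]; // wrap to close the polygon
        // Only edges straddling the ray's y can cross it.
        if (y1 > py) != (y2 > py) {
            // x coordinate where the edge crosses y = py.
            let x_at_py = x1 + (py - y1) / (y2 - y1) * (x2 - x1);
            if x_at_py > px {
                crossings += 1;
            }
        }
    }
    crossings % 2 == 1
}

fn main() {
    // Triangle in normalised frame coordinates.
    let tri = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)];
    println!("{} {}", point_in_polygon(0.5, 0.3, &tri),
                      point_in_polygon(0.9, 0.9, &tri));
}
```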

Trait Implementations§

impl Clone for FilterStep

fn clone(&self) -> FilterStep

Returns a duplicate of the value.

fn clone_from(&mut self, source: &Self)

Performs copy-assignment from source.

impl Debug for FilterStep

fn fmt(&self, f: &mut Formatter<'_>) -> Result

Formats the value using the given formatter.

Auto Trait Implementations§

Blanket Implementations§

impl<T> Any for T where T: 'static + ?Sized
impl<T> Borrow<T> for T where T: ?Sized
impl<T> BorrowMut<T> for T where T: ?Sized
impl<T> CloneToUninit for T where T: Clone
impl<T> From<T> for T
impl<T, U> Into<U> for T where U: From<T>
impl<T> ToOwned for T where T: Clone
impl<T, U> TryFrom<U> for T where U: Into<T>
impl<T, U> TryInto<U> for T where U: TryFrom<T>