//! The **core** module provides the foundational building blocks for configuring and running FFmpeg
//! pipelines. It encompasses:
//!
//! - **Input & Output Handling** (in [`context`]): Structures and logic (`Input`, `Output`) for
//! specifying where media data originates and where it should be written.
//! - **Filter Descriptions**: Define filter graphs with `FilterComplex` or attach custom [`FrameFilter`](filter::frame_filter::FrameFilter)
//! implementations at the input/output stage.
//! - **Stream and Device Queries** (in [`stream_info`] and [`device`]): Utilities for retrieving
//! information about media streams and available input devices.
//! - **Hardware Acceleration** (in [`hwaccel`]): Enumerate/configure GPU-accelerated codecs (CUDA, VAAPI, etc.).
//! - **Codec Discovery** (in [`codec`]): List encoders/decoders supported by FFmpeg.
//! - **Custom Filters** (in [`filter`]): Implement user-defined [`FrameFilter`](filter::frame_filter::FrameFilter) logic for frames.
//! - **Lifecycle Orchestration** (in [`scheduler`]): [`FfmpegScheduler`](scheduler::ffmpeg_scheduler::FfmpegScheduler) that runs the configured pipeline
//! (synchronously or asynchronously if the `async` feature is enabled).
//!
//! # Submodules
//!
//! - [`context`]: Houses [`FfmpegContext`](context::ffmpeg_context::FfmpegContext)—the central struct for assembling inputs, outputs, and filters.
//! - [`scheduler`]: Defines [`FfmpegScheduler`](scheduler::ffmpeg_scheduler::FfmpegScheduler), managing the execution of an `FfmpegContext` pipeline.
//! - [`container_info`]: Utilities to extract information about the container, such as duration and format details.
//! - [`stream_info`]: Inspect media streams (e.g., find video/audio streams in a file).
//! - [`device`]: Query audio/video input devices (cameras, microphones, etc.) on various platforms.
//! - [`hwaccel`]: Helpers for hardware-accelerated encoding/decoding setup.
//! - [`codec`]: Tools to discover which encoders/decoders your FFmpeg build supports.
//! - [`filter`]: Query FFmpeg's built-in filters and infrastructure for building custom frame-processing filters.
//! - [`packet_scanner`]: Iterate over raw demuxed packets without decoding, for fast packet-level inspection.
//!
//! # Example Workflow
//!
//! 1. **Build a context** using [`FfmpegContext::builder()`](crate::core::context::ffmpeg_context::FfmpegContext::builder)
//! specifying your input, any filters, and your output.
//! 2. **Create a scheduler** with [`FfmpegScheduler::new`](crate::core::scheduler::ffmpeg_scheduler::FfmpegScheduler::new),
//! then call `.start()` to begin processing.
//! 3. **Wait** (or `.await` if `async` feature is enabled) for the job to complete. Use the returned
//! `Result` to detect success or failure.
//!
//! # Example
//! ```rust,ignore
//! use ez_ffmpeg::{FfmpegContext, FfmpegScheduler};
//!
//! fn main() -> Result<(), Box<dyn std::error::Error>> {
//!     // 1. Build an FfmpegContext with an input, a simple filter, and an output
//!     let context = FfmpegContext::builder()
//!         .input("test.mp4")
//!         .filter_desc("hue=s=0") // Example: desaturate video
//!         .output("output.mp4")
//!         .build()?;
//!
//!     // 2. Create a scheduler and start the job
//!     let scheduler = FfmpegScheduler::new(context).start()?;
//!
//!     // 3. Block until it's finished
//!     scheduler.wait()?;
//!     Ok(())
//! }
//! ```
/// The **context** module provides tools for assembling an entire FFmpeg pipeline,
/// culminating in the [`FfmpegContext`](context::ffmpeg_context::FfmpegContext). This includes:
///
/// - **Inputs**: [`Input`](context::input::Input) objects representing files, URLs, or custom I/O callbacks.
/// - **Outputs**: [`Output`](context::output::Output) objects representing target files, streams, or custom sinks.
/// - **Filter Descriptions**: Simple inline filters via `filter_desc` or more complex
/// [`FilterComplex`](context::filter_complex::FilterComplex) graphs.
/// - **Builders**: e.g., [`FfmpegContextBuilder`](context::ffmpeg_context_builder::FfmpegContextBuilder) for constructing a complete context
/// with multiple inputs, outputs, and filter settings.
///
/// Once you’ve built an [`FfmpegContext`](context::ffmpeg_context::FfmpegContext), you can execute it via the [`FfmpegScheduler`](scheduler::ffmpeg_scheduler::FfmpegScheduler).
///
/// # Example
///
/// ```rust,ignore
/// // Build an FFmpeg context with one input, some filter settings, and one output.
/// let context = FfmpegContext::builder()
///     .input("test.mp4")
///     .filter_desc("hue=s=0")
///     .output("output.mp4")
///     .build()
///     .unwrap();
/// // The context now holds all info needed for an FFmpeg job.
/// ```
pub mod context;
/// The **scheduler** module orchestrates the execution of a configured [`FfmpegContext`](context::ffmpeg_context::FfmpegContext).
/// It provides the [`FfmpegScheduler`](scheduler::ffmpeg_scheduler::FfmpegScheduler) struct, which:
///
/// - **Starts** the FFmpeg pipeline via [`FfmpegScheduler::start()`](scheduler::ffmpeg_scheduler::FfmpegScheduler::start).
/// - **Manages** thread or subprocess creation, ensuring all streams and filters run.
/// - **Waits** for completion (blocking or asynchronous, depending on whether the `async` feature is enabled).
/// - **Returns** the final result, indicating success or failure.
///
/// # Synchronous Example
///
/// ```rust,ignore
/// let context = FfmpegContext::builder()
///     .input("test.mp4")
///     .filter_desc("hue=s=0")
///     .output("output.mp4")
///     .build()
///     .unwrap();
///
/// let result = FfmpegScheduler::new(context)
///     .start()
///     .unwrap()
///     .wait();
///
/// assert!(result.is_ok(), "FFmpeg job failed unexpectedly");
/// ```
///
/// # Asynchronous Example (requires `async` feature)
///
/// ```rust,ignore
/// #[tokio::main]
/// async fn main() {
///     let context = FfmpegContext::builder()
///         .input("test.mp4")
///         .output("output.mp4")
///         .build()
///         .unwrap();
///
///     let scheduler = FfmpegScheduler::new(context)
///         .start()
///         .expect("Failed to start FFmpeg job");
///
///     // Asynchronous wait
///     scheduler.await.expect("FFmpeg job failed unexpectedly");
/// }
/// ```
pub mod scheduler;
/// The **container_info** module provides utilities for retrieving metadata related to the media container,
/// such as duration, format, and other general properties of the media file.
///
/// This module helps to query the overall properties of a media container file (e.g., `.mp4`, `.avi`, `.mkv`)
/// without diving into individual streams (audio, video, etc.). It is useful when you need information
/// about the file as a whole, such as total duration, format type, and container-specific properties.
///
/// # Examples
///
/// ```rust,ignore
/// // Retrieve the duration in microseconds for the media file "test.mp4"
/// let duration = get_duration_us("test.mp4").unwrap();
/// println!("Duration: {} us", duration);
///
/// // Retrieve the format name for "test.mp4"
/// let format = get_format("test.mp4").unwrap();
/// println!("Format: {}", format);
///
/// // Retrieve the metadata for "test.mp4"
/// let metadata = get_metadata("test.mp4").unwrap();
/// for (key, value) in metadata {
///     println!("{}: {}", key, value);
/// }
/// ```
///
/// These helper functions return the container-level metadata, and they handle any errors that may arise
/// (e.g., if the file can't be opened or if there is an issue reading the data).
pub mod container_info;
/// The **stream_info** module provides utilities to retrieve detailed information
/// about media streams (video, audio, and more) from an input source (e.g., a local file
/// path, an RTMP URL, etc.). It queries FFmpeg for metadata regarding stream types, codec
/// parameters, duration, and other relevant details.
///
/// # Examples
///
/// ```rust,ignore
/// // Retrieve information about the first video stream in "test.mp4"
/// let maybe_video_info = find_video_stream_info("test.mp4").unwrap();
/// if let Some(video_info) = maybe_video_info {
///     println!("Found video stream: {:?}", video_info);
/// } else {
///     println!("No video stream found.");
/// }
///
/// // Retrieve information about the first audio stream in "test.mp4"
/// let maybe_audio_info = find_audio_stream_info("test.mp4").unwrap();
/// if let Some(audio_info) = maybe_audio_info {
///     println!("Found audio stream: {:?}", audio_info);
/// } else {
///     println!("No audio stream found.");
/// }
///
/// // Retrieve information about all streams (video, audio, etc.) in "test.mp4"
/// let all_infos = find_all_stream_infos("test.mp4").unwrap();
/// println!("Total streams found: {}", all_infos.len());
/// for info in all_infos {
///     println!("{:?}", info);
/// }
/// ```
///
/// These helper functions return `Result<Option<StreamInfo>, Error>` or `Result<Vec<StreamInfo>, Error>`
/// depending on the call, allowing you to differentiate between "no stream found" (returns `Ok(None)`)
/// and encountering an actual error (returns `Err(...)`).
pub mod stream_info;
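The `Ok(None)` / `Err(...)` distinction described above can be sketched with a self-contained toy; the `find_video_stream` function below is a hypothetical stand-in for illustration, not the crate's real API:

```rust
// Hypothetical stand-in for the stream_info convention: `Ok(None)` means the
// file was readable but contained no matching stream, while `Err` signals a
// real failure such as an unreadable file.
fn find_video_stream(path: &str) -> Result<Option<String>, String> {
    match path {
        "missing.mp4" => Err(format!("could not open {path}")),
        "audio_only.mp4" => Ok(None), // readable, but no video stream present
        _ => Ok(Some(format!("h264 stream in {path}"))),
    }
}

fn main() {
    // All three outcomes are distinguishable by the caller.
    assert!(matches!(find_video_stream("test.mp4"), Ok(Some(_))));
    assert!(matches!(find_video_stream("audio_only.mp4"), Ok(None)));
    assert!(find_video_stream("missing.mp4").is_err());
    println!("all three outcomes distinguished");
}
```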
/// The **packet_scanner** module provides a lightweight packet-level scanner for media files.
///
/// Unlike the full demuxing pipeline, `PacketScanner` iterates over raw demuxed packets
/// without any decoding. This is useful for inspecting packet metadata such as timestamps,
/// keyframe flags, sizes, and stream indices.
///
/// # Examples
///
/// ```rust,ignore
/// use ez_ffmpeg::packet_scanner::PacketScanner;
///
/// let mut scanner = PacketScanner::open("test.mp4")?;
/// for packet in scanner.packets() {
///     let packet = packet?;
///     println!(
///         "stream={} pts={:?} size={} keyframe={}",
///         packet.stream_index(),
///         packet.pts(),
///         packet.size(),
///         packet.is_keyframe(),
///     );
/// }
/// ```
pub mod packet_scanner;
/// The **device** module provides cross-platform methods to query available audio and video
/// input devices on the system. Depending on the target operating system, it internally
/// delegates to different platform APIs or FFmpeg’s device capabilities:
///
/// - **macOS**: Leverages AVFoundation for enumerating devices such as cameras ("vide")
/// and microphones ("soun").
/// - **Other OSes**: Uses FFmpeg’s `avdevice` to list input devices for video and audio.
///
/// These functions can be used to programmatically discover devices before choosing one
/// for capture or recording in an FFmpeg-based pipeline.
///
/// # Examples
///
/// ```rust,ignore
/// // Query video input devices (e.g., cameras)
/// let video_devices = get_input_video_devices().unwrap();
/// for device in &video_devices {
///     println!("Available video device: {}", device);
/// }
///
/// // Query audio input devices (e.g., microphones)
/// let audio_devices = get_input_audio_devices().unwrap();
/// for device in &audio_devices {
///     println!("Available audio device: {}", device);
/// }
/// ```
///
/// # Notes
///
/// - If the query process fails (e.g., missing permissions or no devices available),
/// the functions return an appropriate error from `crate::error`.
/// - On macOS, the `AVFoundation` framework is used directly. On other platforms, FFmpeg’s
/// `avdevice` functionality is used. Implementation details differ, but the returned
/// results have a uniform format: a list of human-readable device names.
/// - For more advanced device details (e.g., supported formats or resolutions), you may need
/// to perform additional FFmpeg queries or platform-specific calls.
pub mod device;
/// The **hwaccel** module provides functionality for working with hardware-accelerated
/// codecs in FFmpeg. It allows you to detect and configure various hardware devices
/// (like NVENC, VAAPI, DXVA2, or VideoToolbox) so that FFmpeg can offload encoding or
/// decoding tasks to GPU or specialized hardware.
///
/// # Public API
///
/// - [`get_hwaccels()`](hwaccel::get_hwaccels): Enumerates the hardware acceleration backends available on the
/// current system, returning a list of [`HWAccelInfo`](hwaccel::HWAccelInfo) items. Each item contains a
/// readable name (e.g., `"cuda"`, `"vaapi"`) and the corresponding `AVHWDeviceType`.
///
/// # Example
///
/// ```rust,ignore
/// // Query hardware acceleration backends
/// let hwaccels = get_hwaccels();
/// for accel in hwaccels {
///     println!("Found HW Accel: {} (type: {:?})", accel.name, accel.hw_device_type);
/// }
/// ```
///
/// # Notes
///
/// - While only [`get_hwaccels()`](hwaccel::get_hwaccels) is directly exposed, internally the module contains
/// various helpers to initialize and manage hardware devices (e.g., `hw_device_init_from_string`).
/// These are used behind the scenes or in more advanced scenarios where explicit control
/// over device creation is required.
/// - Hardware acceleration support depends on both FFmpeg’s compilation configuration
/// and the underlying system drivers/frameworks. Not all listed accelerations may be
/// fully functional on every platform.
pub mod hwaccel;
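As a sketch of how an enumeration result might be consumed, the helper below picks the first backend from a caller-supplied preference list that the system actually reports, falling back to software decoding. `pick_backend` and the fallback name are illustrative assumptions, not part of the crate:

```rust
// Hypothetical helper: `available` holds the backend names as reported by the
// system (in the spirit of `get_hwaccels()`), `preferred` is the caller's
// priority order. Returns the first match, or "software" if none is present.
fn pick_backend(available: &[&str], preferred: &[&str]) -> String {
    preferred
        .iter()
        .find(|want| available.contains(want))
        .map(|s| s.to_string())
        .unwrap_or_else(|| "software".to_string())
}

fn main() {
    let available = ["videotoolbox", "vaapi"];
    // "cuda" is not available, so the next preference wins.
    assert_eq!(pick_backend(&available, &["cuda", "vaapi"]), "vaapi");
    // No preferred backend is available: fall back to software.
    assert_eq!(pick_backend(&available, &["cuda", "dxva2"]), "software");
    println!("backend selection works");
}
```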
/// The **codec** module provides helpers for enumerating and querying FFmpeg’s
/// available audio/video **encoders** and **decoders**. This can be useful for
/// discovering which codecs are supported in your current FFmpeg build, along
/// with their core attributes.
///
/// # Public API
///
/// - [`get_encoders()`](codec::get_encoders): Returns a list of [`CodecInfo`](codec::CodecInfo) representing all
/// encoders (e.g., H.264, AAC) recognized by FFmpeg.
/// - [`get_decoders()`](codec::get_decoders): Returns a list of [`CodecInfo`](codec::CodecInfo) representing all
/// decoders (e.g., H.264, AAC) recognized by FFmpeg.
///
/// # Example
///
/// ```rust,ignore
/// // List all available encoders
/// let encoders = get_encoders();
/// for enc in &encoders {
///     println!("Encoder: {} - {}", enc.codec_name, enc.codec_long_name);
/// }
///
/// // List all available decoders
/// let decoders = get_decoders();
/// for dec in &decoders {
///     println!("Decoder: {} - {}", dec.codec_name, dec.codec_long_name);
/// }
/// ```
///
/// # Data Structures
///
/// - [`CodecInfo`](codec::CodecInfo): Contains user-friendly fields such as:
/// - `codec_name` / `codec_long_name`
/// - `desc_name`: The descriptor name from FFmpeg.
/// - `media_type` (audio/video/subtitle, etc.)
/// - `codec_id` (internal FFmpeg ID)
/// - `codec_capabilities` (bitmask indicating codec features)
///
/// # Notes
///
/// - The underlying `Codec` struct is for internal usage only, bridging to
/// the raw FFmpeg APIs. In most cases, you only need the higher-level [`CodecInfo`](codec::CodecInfo)
/// data from the public functions above.
/// - The available encoders/decoders can vary depending on your FFmpeg build
/// and any external libraries installed on the system.
pub mod codec;
/// The **filter** module provides a flexible framework for custom frame processing
/// within the FFmpeg pipeline, along with the ability to query FFmpeg's built-in filters.
/// It introduces the [`FrameFilter`](filter::frame_filter::FrameFilter) trait, which defines how to apply transformations
/// (e.g., scaling, color adjustments, GPU-accelerated effects) to decoded frames.
/// You can attach these filters to either the input or the output side
/// (depending on your desired pipeline design) so that frames are automatically
/// processed in your FFmpeg workflow.
///
/// # FFmpeg Built-in Filters
///
/// ```rust,ignore
/// use ez_ffmpeg::core::filter::get_filters;
///
/// // Query available FFmpeg filters
/// let filters = get_filters();
/// for filter in filters {
///     println!("Filter: {} - {}", filter.name, filter.description);
/// }
/// ```
///
/// # Defining and Using a Custom Filter
///
/// Below is a minimal example showing how to implement a custom filter and attach it to
/// an `Output` so that every frame is processed before encoding. You could likewise
/// attach it to an `Input` if you want the frames processed immediately after decoding.
///
/// ```rust,ignore
///
/// // 1. Define your custom filter by implementing the FrameFilter trait.
/// struct FlipFilter;
///
/// impl FrameFilter for FlipFilter {
///     fn media_type(&self) -> AVMediaType {
///         // This filter operates on video frames.
///         AVMediaType::AVMEDIA_TYPE_VIDEO
///     }
///
///     fn filter_frame(
///         &mut self,
///         mut frame: Frame,
///         _ctx: &FrameFilterContext,
///     ) -> Result<Option<Frame>, String> {
///         unsafe {
///             if frame.as_ptr().is_null() || frame.is_empty() {
///                 return Ok(Some(frame));
///             }
///         }
///
///         // Here you would implement the logic to transform the frame.
///         // As a trivial example, we just return the original frame.
///         // (Replace this with your actual transformation code.)
///
///         Ok(Some(frame))
///     }
/// }
///
/// fn main() -> Result<(), Box<dyn std::error::Error>> {
///     // 2. Create a pipeline builder for video frames.
///     let mut pipeline_builder: FramePipelineBuilder = AVMediaType::AVMEDIA_TYPE_VIDEO.into();
///
///     // 3. Add your custom filter to the pipeline, giving it a unique name.
///     pipeline_builder = pipeline_builder.filter("flip-filter", Box::new(FlipFilter));
///
///     // 4. Attach the pipeline to an Output (could also attach to an Input).
///     let mut output: Output = "output.mp4".into();
///     output.add_frame_pipeline(pipeline_builder);
///
///     // 5. Build the FFmpeg context with both input and output.
///     let context = FfmpegContext::builder()
///         .input("input.mp4")
///         .output(output)
///         .build()?;
///
///     // 6. Run the FFmpeg job via the scheduler.
///     FfmpegScheduler::new(context)
///         .start()?
///         .wait()?;
///
///     Ok(())
/// }
/// ```
///
/// In this example:
/// 1. We define a **`FlipFilter`** that implements the [`FrameFilter`](filter::frame_filter::FrameFilter) trait and specifies
/// `AVMediaType::AVMEDIA_TYPE_VIDEO`.
/// 2. We create a **`FramePipelineBuilder`** for `VIDEO` frames and add our filter to it.
/// 3. We attach that pipeline to the **`Output`** configuration, so frames will be processed
/// (in this case, “flipped”) before encoding.
/// 4. Finally, we build the FFmpeg context and run it with the **`FfmpegScheduler`**.
///
/// # More Advanced Filters
///
/// For a more complex, GPU-accelerated example, see the **OpenGL**-based filters in the
/// [`opengl` module](crate::opengl). There, you can use custom GLSL shaders to apply
/// sophisticated transformations or visual effects on video frames.
///
/// # Trait Overview
///
/// The [`FrameFilter`](filter::frame_filter::FrameFilter) trait exposes several methods you can override:
/// - [`FrameFilter::media_type()`](filter::frame_filter::FrameFilter::media_type): Indicates which media type (video, audio, etc.) this filter handles.
/// - [`FrameFilter::init()`](filter::frame_filter::FrameFilter::init): Called once when the filter is first created (e.g., allocate resources).
/// - [`FrameFilter::filter_frame()`](filter::frame_filter::FrameFilter::filter_frame): The primary method for transforming an incoming frame.
/// - [`FrameFilter::request_frame()`](filter::frame_filter::FrameFilter::request_frame): If your filter generates frames on its own, you can override this.
/// - [`FrameFilter::uninit()`](filter::frame_filter::FrameFilter::uninit): Called during cleanup when the filter is removed or the pipeline ends.
///
/// By chaining multiple filters in a pipeline, you can create sophisticated processing
/// chains for your media data.
pub mod filter;
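The lifecycle and chaining described above (init, then filter_frame per frame, then uninit, with `None` dropping a frame) can be sketched with simplified stand-in types. `ToyFrameFilter`, `Gain`, `DropEmpty`, and `run_pipeline` are illustrative inventions, not the crate's real `FrameFilter` API:

```rust
// Simplified stand-ins: a "frame" is just a Vec of samples, and the trait
// mirrors the init / filter_frame / uninit lifecycle described above.
trait ToyFrameFilter {
    fn init(&mut self) {}
    fn filter_frame(&mut self, frame: Vec<i32>) -> Option<Vec<i32>>;
    fn uninit(&mut self) {}
}

/// Doubles every sample.
struct Gain;
impl ToyFrameFilter for Gain {
    fn filter_frame(&mut self, frame: Vec<i32>) -> Option<Vec<i32>> {
        Some(frame.into_iter().map(|s| s * 2).collect())
    }
}

/// Drops empty frames; returning `None` removes the frame from the pipeline.
struct DropEmpty;
impl ToyFrameFilter for DropEmpty {
    fn filter_frame(&mut self, frame: Vec<i32>) -> Option<Vec<i32>> {
        if frame.is_empty() { None } else { Some(frame) }
    }
}

/// Runs a frame through a chain of filters, stopping if any filter drops it.
fn run_pipeline(filters: &mut [Box<dyn ToyFrameFilter>], frame: Vec<i32>) -> Option<Vec<i32>> {
    filters.iter_mut().try_fold(frame, |f, filt| filt.filter_frame(f))
}

fn main() {
    let mut chain: Vec<Box<dyn ToyFrameFilter>> = vec![Box::new(DropEmpty), Box::new(Gain)];
    for f in chain.iter_mut() { f.init(); }
    assert_eq!(run_pipeline(&mut chain, vec![1, 2, 3]), Some(vec![2, 4, 6]));
    assert_eq!(run_pipeline(&mut chain, vec![]), None); // dropped by DropEmpty
    for f in chain.iter_mut() { f.uninit(); }
    println!("pipeline ok");
}
```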
/// The **metadata** module provides internal metadata handling for FFmpeg operations.
///
/// **Internal Use Only**: This module contains unsafe FFmpeg C API wrappers.
/// Users should use the safe public API on `Output` instead:
/// - `Output::add_metadata()` for global metadata
/// - `Output::add_stream_metadata()` for stream metadata
/// - `Output::map_metadata_from_input()` for metadata mapping
/// - `Output::disable_auto_copy_metadata()` for controlling auto-copy
///
/// # Example
/// ```rust,ignore
/// let output = Output::from("output.mp4")
///     .add_metadata("title", "My Video")
///     .add_metadata("author", "John Doe")
///     .add_stream_metadata("v:0", "language", "eng")?;
/// ```
pub(crate) mod metadata;
pub static INIT_FFMPEG: Once = Once::new();
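`std::sync::Once` guarantees its guarded closure runs exactly once, even when many threads race to trigger it. The sketch below demonstrates the pattern with a hypothetical `ensure_ffmpeg_initialized` wrapper and an atomic counter standing in for the real FFmpeg setup work:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Once;

// Stand-in for the crate's one-time FFmpeg initialization: the counter lets
// us observe that the closure runs exactly once, no matter how many threads race.
static INIT_FFMPEG: Once = Once::new();
static INIT_CALLS: AtomicUsize = AtomicUsize::new(0);

fn ensure_ffmpeg_initialized() {
    INIT_FFMPEG.call_once(|| {
        // In the real crate this is where device registration,
        // log-callback setup, etc. would happen.
        INIT_CALLS.fetch_add(1, Ordering::SeqCst);
    });
}

fn main() {
    // Eight threads race to initialize; `call_once` serializes them.
    let handles: Vec<_> = (0..8)
        .map(|_| std::thread::spawn(ensure_ffmpeg_initialized))
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    assert_eq!(INIT_CALLS.load(Ordering::SeqCst), 1);
    println!("initialized exactly once");
}
```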
extern "C"
// The following type definitions for `VaListType` are inspired by the Rust standard library's
// implementation of `va_list` (see std::ffi::va_list::VaListImpl). These definitions ensure compatibility
// with platform-specific ABI requirements when interfacing with C variadic functions.
// Note: the `#[cfg]` gates below are a reconstruction (an assumption, not taken from the
// original source) so that exactly one alias is active per target; verify them against
// the platforms you actually build for.
#[cfg(any(target_os = "windows", all(target_arch = "aarch64", any(target_os = "macos", target_os = "ios"))))]
type VaListType = *mut c_char;
#[cfg(all(target_arch = "x86_64", not(target_os = "windows")))]
type VaListType = *mut __va_list_tag;
#[cfg(all(target_arch = "aarch64", not(any(target_os = "windows", target_os = "macos", target_os = "ios"))))]
type VaListType = *mut c_void;
#[cfg(any(target_arch = "powerpc", target_arch = "powerpc64"))]
type VaListType = *mut __va_list_tag_powerpc;
#[cfg(target_arch = "s390x")]
type VaListType = *mut __va_list_tag_s390x;
unsafe extern "C"