A high-level API for processing and synthesizing audio.

Example

use std::fs::File;
use web_audio_api::context::{BaseAudioContext, AudioContext};
use web_audio_api::node::{AudioNode, AudioScheduledSourceNode};

// set up AudioContext with optimized settings for your hardware
let context = AudioContext::default();

// create an audio buffer from a given file
let file = File::open("samples/sample.wav").unwrap();
let buffer = context.decode_audio_data_sync(file).unwrap();

// play the buffer at given volume
let volume = context.create_gain();
volume.connect(&context.destination());
volume.gain().set_value(0.5);

let buffer_source = context.create_buffer_source();
buffer_source.connect(&volume);
buffer_source.set_buffer(buffer);

// create oscillator branch
let osc = context.create_oscillator();
osc.connect(&context.destination());

// start the sources
buffer_source.start();
osc.start();

// enjoy listening
std::thread::sleep(std::time::Duration::from_secs(4));
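For intuition about the `set_value(0.5)` call above: gain scales each sample's amplitude linearly, and a factor of 0.5 corresponds to roughly -6 dB. A quick sanity check in plain Rust, independent of the crate:

```rust
fn main() {
    let gain: f32 = 0.5;
    // Convert a linear amplitude factor to decibels: dB = 20 * log10(gain).
    let db = 20.0 * gain.log10();
    assert!((db + 6.0206).abs() < 1e-3); // 0.5 ≈ -6.02 dB
    println!("gain {gain} ≈ {db:.2} dB");
}
```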

Modules

- `context` — The BaseAudioContext interface and the AudioContext and OfflineAudioContext types
- `media` — Convenience abstractions that are not part of the Web Audio API (media decoding, microphone)
- `node` — The AudioNode interface and concrete types
- `render` — Primitives related to audio graph rendering

Structs

- Memory-resident audio asset; essentially a matrix of channels × samples
- Options for constructing an AudioBuffer
- Represents the position and orientation of the person listening to the audio scene
- AudioParam controls an individual aspect of an AudioNode's functionality, such as volume
- Options for constructing an AudioParam
- PeriodicWave represents an arbitrary periodic waveform to be used with an OscillatorNode
- Options for constructing a PeriodicWave
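As in the Web Audio spec, a PeriodicWave is described by Fourier coefficients. As a rough illustration in plain Rust (the function and names here are hypothetical, not the crate's API), summing sine harmonics produces one period of the waveform:

```rust
use std::f32::consts::PI;

// Sketch of what a PeriodicWave encodes: sine (imaginary) Fourier
// coefficients summed to produce one period of a waveform.
// `harmonics[0]` is the fundamental, `harmonics[1]` the second harmonic, etc.
fn synthesize_period(harmonics: &[f32], num_samples: usize) -> Vec<f32> {
    (0..num_samples)
        .map(|n| {
            let phase = 2.0 * PI * n as f32 / num_samples as f32;
            harmonics
                .iter()
                .enumerate()
                .map(|(k, amp)| amp * ((k + 1) as f32 * phase).sin())
                .sum()
        })
        .collect()
}

fn main() {
    // A single harmonic of amplitude 1.0 yields a plain sine period.
    let wave = synthesize_period(&[1.0], 4);
    // Samples at phases 0, π/2, π, 3π/2: approximately [0, 1, 0, -1].
    assert!(wave[0].abs() < 1e-6);
    assert!((wave[1] - 1.0).abs() < 1e-6);
    assert!((wave[3] + 1.0).abs() < 1e-6);
    println!("{wave:?}");
}
```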

Enums

- Precision of AudioParam value calculation per render quantum
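The two precision levels correspond to the Web Audio spec's a-rate (a value computed for every sample frame) and k-rate (a single value held for the whole render quantum) behavior. A hypothetical plain-Rust sketch of the difference, assuming a 128-frame quantum and a linear ramp (this is illustrative, not the crate's internals):

```rust
const RENDER_QUANTUM_SIZE: usize = 128;

// Linear interpolation between start and end at normalized time t in [0, 1).
fn linear_ramp(start: f32, end: f32, t: f32) -> f32 {
    start + (end - start) * t
}

// a-rate: one value per sample frame within the render quantum.
fn a_rate_values(start: f32, end: f32) -> Vec<f32> {
    (0..RENDER_QUANTUM_SIZE)
        .map(|i| linear_ramp(start, end, i as f32 / RENDER_QUANTUM_SIZE as f32))
        .collect()
}

// k-rate: the value is computed once and held for all 128 frames.
fn k_rate_values(start: f32, end: f32) -> Vec<f32> {
    vec![linear_ramp(start, end, 0.0); RENDER_QUANTUM_SIZE]
}

fn main() {
    let a = a_rate_values(0.0, 1.0);
    let k = k_rate_values(0.0, 1.0);
    assert_eq!(a.len(), 128);
    assert!(a[64] > a[0]); // a-rate changes within the quantum
    assert!(k.iter().all(|&v| v == k[0])); // k-rate is constant per quantum
}
```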

Constants

- Maximum number of channels for audio processing
- Render quantum size; the audio graph is rendered in blocks of RENDER_QUANTUM_SIZE samples. See https://webaudio.github.io/web-audio-api/#render-quantum
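Because the graph advances in fixed blocks of RENDER_QUANTUM_SIZE frames, a duration is effectively rounded up to a whole number of quanta. A small plain-Rust sketch (the 128-frame quantum matches the constant above; the sample rates are example values, not read from a real context):

```rust
const RENDER_QUANTUM_SIZE: u64 = 128;

// Number of render quanta needed to cover `seconds` of audio at `sample_rate`.
// A partial final quantum is still rendered in full, so we round up.
fn quanta_for(seconds: f64, sample_rate: u32) -> u64 {
    let frames = (seconds * sample_rate as f64).ceil() as u64;
    (frames + RENDER_QUANTUM_SIZE - 1) / RENDER_QUANTUM_SIZE
}

fn main() {
    assert_eq!(quanta_for(1.0, 44_100), 345); // 44_100 / 128 = 344.53… → 345
    assert_eq!(quanta_for(1.0, 48_000), 375); // 48_000 / 128 = 375 exactly
    println!("1 s @ 44.1 kHz = {} quanta", quanta_for(1.0, 44_100));
}
```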