Struct web_audio_api::context::OfflineAudioContext

pub struct OfflineAudioContext { /* fields omitted */ }

The OfflineAudioContext doesn’t render the audio to the device hardware; instead, it generates it, as fast as it can, and outputs the result to an AudioBuffer.
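
As a quick orientation, here is a minimal sketch of offline rendering. It assumes the OfflineAudioContext::new(number_of_channels, length, sample_rate) constructor, the BaseAudioContext trait for the create_* methods, and a blocking start_rendering_sync() call as found in recent versions of this crate; names and signatures (for example start_rendering vs start_rendering_sync, and the type of the sample rate argument) have changed between releases, so check the version you are using.

use web_audio_api::context::{BaseAudioContext, OfflineAudioContext};
use web_audio_api::node::{AudioNode, AudioScheduledSourceNode};

// Render one second of stereo audio at 44.1 kHz, as fast as possible.
let mut context = OfflineAudioContext::new(2, 44_100, 44_100.);

// Build a trivial graph: a 440 Hz tone routed to the destination.
let mut osc = context.create_oscillator();
osc.frequency().set_value(440.);
osc.connect(&context.destination());
osc.start();

// Rendering never touches the audio hardware; the result is an AudioBuffer.
let buffer = context.start_rendering_sync();
assert_eq!(buffer.length(), 44_100);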

Implementations

Trait Implementations

Creates an OscillatorNode, a source representing a periodic waveform. It basically generates a tone.

Creates a GainNode, to control audio volume
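
A rough sketch of volume control with a GainNode (same imports and version assumptions as the example above; gain() is assumed to expose the gain AudioParam):

// Halve the amplitude of a tone (about -6 dB) before it reaches the destination.
let context = OfflineAudioContext::new(2, 44_100, 44_100.);
let mut osc = context.create_oscillator();
let gain = context.create_gain();
gain.gain().set_value(0.5);
osc.connect(&gain);
gain.connect(&context.destination());
osc.start();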

Creates a ConstantSourceNode, a source representing a constant value

Creates a DelayNode, delaying the audio signal
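
A sketch of a simple delay line (assuming create_delay takes the maximum delay time in seconds and delay_time() exposes the delay AudioParam, as in recent crate versions):

// Delay the oscillator by half a second; 1.0 is the maximum delay time.
let context = OfflineAudioContext::new(2, 44_100, 44_100.);
let mut osc = context.create_oscillator();
let delay = context.create_delay(1.);
delay.delay_time().set_value(0.5);
osc.connect(&delay);
delay.connect(&context.destination());
osc.start();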

Creates a BiquadFilterNode

Creates a ChannelSplitterNode

Creates a ChannelMergerNode

Creates a MediaStreamAudioSourceNode from a MediaElement

Creates a MediaElementAudioSourceNode from a MediaElement

Creates an AudioBufferSourceNode
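
A sketch of playing back a pre-filled buffer (create_buffer, copy_to_channel and set_buffer are assumed here as in recent crate versions; older releases expose different constructors, so treat the exact calls as illustrative):

// Fill a one-channel, one-second buffer with a simple ramp and play it back.
// Same imports and version assumptions as the first example.
let context = OfflineAudioContext::new(1, 44_100, 44_100.);
let length = context.sample_rate() as usize;
let mut buffer = context.create_buffer(1, length, context.sample_rate());
let ramp: Vec<f32> = (0..length).map(|i| i as f32 / length as f32 - 0.5).collect();
buffer.copy_to_channel(&ramp, 0);

let mut src = context.create_buffer_source();
src.set_buffer(buffer);
src.connect(&context.destination());
src.start();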

Creates a PannerNode

Creates an AnalyserNode

Creates a periodic wave

Creates an AudioParam.

Returns an AudioDestinationNode representing the final destination of all audio in the context. It can be thought of as the audio-rendering device.

Returns the AudioListener which is used for 3D spatialization

The sample rate (in sample-frames per second) at which the AudioContext handles audio.

This is the time in seconds of the sample-frame immediately following the last sample-frame in the block of audio most recently processed by the context’s rendering graph.
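
For example, on a freshly created offline context (assuming the f32 sample rate and f64 current time accessors of recent crate versions):

// The offline clock only advances while the graph is being rendered.
let context = OfflineAudioContext::new(1, 128, 48_000.);
assert_eq!(context.sample_rate(), 48_000.);
assert_eq!(context.current_time(), 0.);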

Auto Trait Implementations

Blanket Implementations

Gets the TypeId of self.

Immutably borrows from an owned value.

Mutably borrows from an owned value.

Performs the conversion.

Performs the conversion.

The type returned in the event of a conversion error.

Performs the conversion.

The type returned in the event of a conversion error.

Performs the conversion.