Struct web_audio_api::context::OfflineAudioContext
pub struct OfflineAudioContext { /* fields omitted */ }
The OfflineAudioContext doesn’t render the audio to the device hardware; instead, it generates it, as fast as it can, and outputs the result to an AudioBuffer.
Implementations
Creates an OfflineAudioContext instance
Arguments
channels - number of output channels to render
length - length of the rendering audio buffer
sample_rate - output sample rate
OfflineAudioContext doesn’t start rendering automatically. You need to call this function to start the audio rendering.
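A minimal end-to-end sketch, assuming the constructor arguments listed above and that the rendering method is named start_rendering and returns the resulting AudioBuffer; the import paths and trait names below are assumptions about the crate layout, and osc.start() may additionally require the crate’s AudioScheduledSourceNode trait in scope:

use web_audio_api::context::{AsBaseAudioContext, OfflineAudioContext};
use web_audio_api::node::AudioNode;
use web_audio_api::SampleRate;

// 2 output channels, 3 seconds of audio at 44.1 kHz (length is in sample frames)
let context = OfflineAudioContext::new(2, 3 * 44_100, SampleRate(44_100));

// build a small graph: oscillator -> destination
let osc = context.create_oscillator();
osc.connect(&context.destination());
osc.start();

// render as fast as possible; the result is an AudioBuffer
let buffer = context.start_rendering();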
Trait Implementations
Retrieves the BaseAudioContext associated with the concrete AudioContext
Retrieves an AudioBuffer from a given std::fs::File
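A hedged sketch of decoding a file into an AudioBuffer, continuing with the context created in the rendering example above. The method name decode_audio_data is an assumption based on the Web Audio API naming, the file path is hypothetical, and in practice the call may return a Result rather than the buffer directly:

use std::fs::File;

// hypothetical sample file; decode_audio_data is an assumed method name
let file = File::open("sample.wav").unwrap();
let decoded = context.decode_audio_data(file);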
fn create_buffer(
&self,
number_of_channels: usize,
length: usize,
sample_rate: SampleRate
) -> AudioBuffer
Create a new “in-memory” AudioBuffer with the given number of channels, length (i.e. number of samples per channel) and sample rate.
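For example, continuing with the context above, a one-second stereo buffer; this sketch assumes SampleRate is the tuple struct used in the signature above:

// 2 channels, 44_100 frames per channel, i.e. one second at 44.1 kHz
let buffer = context.create_buffer(2, 44_100, SampleRate(44_100));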
Creates an OscillatorNode, a source representing a periodic waveform. It basically generates a tone.
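A short sketch, continuing from the context above and assuming frequency() exposes an AudioParam with a set_value method:

// a 440 Hz tone routed to the context's destination
let osc = context.create_oscillator();
osc.frequency().set_value(440.0);
osc.connect(&context.destination());
osc.start();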
Creates a StereoPannerNode to pan a stereo output
Creates a GainNode, to control audio volume
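For instance, attenuating the oscillator from the sketch above by half, assuming gain() exposes an AudioParam:

// osc -> gain (0.5) -> destination
let gain = context.create_gain();
gain.gain().set_value(0.5);
osc.connect(&gain);
gain.connect(&context.destination());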
Creates a ConstantSourceNode, a source representing a constant value
Creates an IirFilterNode
Creates a DelayNode, delaying the audio signal
Creates a BiquadFilterNode which implements a second order filter
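A hedged sketch of a low-pass stage in front of the destination, continuing from the oscillator above; it assumes the filter exposes a frequency() AudioParam and leaves the filter type and Q at their defaults:

// route the oscillator through the second order filter before the destination
let filter = context.create_biquad_filter();
filter.frequency().set_value(1_000.0);
osc.connect(&filter);
filter.connect(&context.destination());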
Creates a WaveShaperNode
Creates a ChannelSplitterNode
Creates a ChannelMergerNode
Creates a MediaStreamAudioSourceNode from a MediaElement
Creates a MediaElementAudioSourceNode from a MediaElement
Creates an AudioBufferSourceNode
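A sketch of playing back an in-memory buffer, reusing the buffer from the create_buffer example above; set_buffer and start are assumed method names following the Web Audio API:

// play the buffer from the start of the offline timeline
let src = context.create_buffer_source();
src.set_buffer(buffer);
src.connect(&context.destination());
src.start();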
Creates a PannerNode
Creates an AnalyserNode
Creates a periodic wave
fn create_audio_param(
&self,
opts: AudioParamOptions,
dest: &AudioNodeId
) -> (AudioParam, AudioParamId)
Create an AudioParam.
Returns an AudioDestinationNode representing the final destination of all audio in the context. It can be thought of as the audio-rendering device.
Returns the AudioListener which is used for 3D spatialization
The sample rate (in sample-frames per second) at which the AudioContext handles audio.
This is the time in seconds of the sample frame immediately following the last sample-frame in the block of audio most recently processed by the context’s rendering graph.
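A small sketch of querying the context clock, continuing from the context above; the getter names sample_rate and current_time are assumptions:

// sample_rate is fixed for the lifetime of the context;
// current_time advances as blocks of audio are rendered
let rate = context.sample_rate();
let now = context.current_time();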