The `AudioContext` interface represents an audio-processing graph built from audio modules linked together, each represented by an `AudioNode`. An audio context controls both the creation of the nodes it contains and the execution of the audio processing or decoding. You need to create an `AudioContext` before you do anything else, as everything happens inside a context.
Constructor
`AudioContext()` - Creates and returns a new `AudioContext` object.
Properties
Also inherits properties from its parent interface, BaseAudioContext.
`AudioContext.baseLatency` (Read only) - Returns the number of seconds of processing latency incurred by the `AudioContext` passing the audio from the `AudioDestinationNode` to the audio subsystem.

`AudioContext.outputLatency` (Read only) - Returns an estimation of the output latency of the current audio context.
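As a sketch (assuming a browser environment; `outputLatency` has limited support, so the read is guarded), the two latency properties can be inspected like this:

```javascript
// Log an AudioContext's latency figures. outputLatency is not
// implemented everywhere, so check for it before reading.
function logLatencies(audioCtx) {
  console.log("Base latency: " + audioCtx.baseLatency + " s");
  if ("outputLatency" in audioCtx) {
    console.log("Output latency: " + audioCtx.outputLatency + " s");
  }
}

// In a browser:
// logLatencies(new AudioContext());
```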
Methods
Also inherits methods from its parent interface, BaseAudioContext.
`AudioContext.close()` - Closes the audio context, releasing any system audio resources that it uses.

`AudioContext.createMediaElementSource()` - Creates a `MediaElementAudioSourceNode` associated with an `HTMLMediaElement`. This can be used to play and manipulate audio from `<video>` or `<audio>` elements.

`AudioContext.createMediaStreamSource()` - Creates a `MediaStreamAudioSourceNode` associated with a `MediaStream` representing an audio stream which may come from the local computer microphone or other sources.

`AudioContext.createMediaStreamDestination()` - Creates a `MediaStreamAudioDestinationNode` associated with a `MediaStream` representing an audio stream which may be stored in a local file or sent to another computer.

`AudioContext.createMediaStreamTrackSource()` - Creates a `MediaStreamTrackAudioSourceNode` associated with a `MediaStream` representing a media stream track.

`AudioContext.getOutputTimestamp()` - Returns a new `AudioTimestamp` object containing two correlated audio stream position values for the context.

`AudioContext.suspend()` - Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.
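A minimal lifecycle sketch (browser-only; the visibility-change wiring is an assumed usage pattern, not part of the API itself). Both `suspend()` and `close()` return Promises:

```javascript
// Suspend the context while the tab is hidden to save CPU/battery,
// and resume it when the tab becomes visible again.
function wireLifecycle(audioCtx) {
  document.addEventListener("visibilitychange", function () {
    if (document.hidden) {
      audioCtx.suspend(); // halts audio hardware access
    } else {
      audioCtx.resume();
    }
  });
}

// Close the context when the app is finished with audio entirely.
function shutdown(audioCtx) {
  return audioCtx.close().then(function () {
    console.log("Audio resources released");
  });
}
```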
Examples
Basic audio context declaration:
var audioCtx = new AudioContext();
Cross browser variant:
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();
var oscillatorNode = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();
var finish = audioCtx.destination;
// etc.
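As a further sketch, a media element can be routed through the graph so its output can be manipulated, here through a `GainNode` (the element id `"player"` is a hypothetical example):

```javascript
// Route an existing <audio> element through a GainNode before it
// reaches the speakers, so its volume can be controlled in the graph.
function attachPlayer(audioCtx) {
  var audioElement = document.getElementById("player"); // hypothetical id
  var source = audioCtx.createMediaElementSource(audioElement);
  var gainNode = audioCtx.createGain();
  gainNode.gain.value = 0.5; // halve the volume
  source.connect(gainNode);
  gainNode.connect(audioCtx.destination);
  return gainNode;
}
```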
Specifications
| Specification | Status | Comment |
|---|---|---|
| Web Audio API | Working Draft | The definition of 'AudioContext' in that specification. |
Browser compatibility
| Feature | Chrome | Edge | Firefox | Internet Explorer | Opera | Safari |
|---|---|---|---|---|---|---|
| Basic Support | 35 14 — 57 webkit | (Yes) | 25 | No | 22 15 — 44 webkit | 6 webkit |
| `AudioContext()` constructor | 55 | ? | 25 | No | 42 | ? |
| `outputLatency` | (Yes) | ? | No | No | (Yes) | No |
| `close()` | 43 | ? | 40 | No | (Yes) | ? |
| `createMediaElementSource()` | 14 | (Yes) | 25 | No | 15 | 6 |
| `createMediaStreamSource()` | 14 | (Yes) | 25 | No | 15 | 6 |
| `createMediaStreamDestination()` | 14 | (Yes) | 25 | No | 15 | 6 |
| `createMediaStreamTrackSource()` | ? | ? | No | No | ? | No |
| `suspend()` | 43 | ? | 40 | No | (Yes) | ? |
| Feature | Android | Chrome for Android | Edge mobile | Firefox for Android | IE mobile | Opera Android | iOS Safari |
|---|---|---|---|---|---|---|---|
| Basic Support | (Yes) | 35 14 — 57 webkit | (Yes) | 26 | No | 22 15 — 44 webkit | ? |
| `AudioContext()` constructor | 55 | 55 | ? | 25 | No | 42 | ? |
| `outputLatency` | (Yes) | (Yes) | ? | No | No | (Yes) | ? |
| `close()` | 43 | 43 | ? | 40 | No | (Yes) | ? |
| `createMediaElementSource()` | (Yes) | 14 | (Yes) | 26 | No | 15 | ? |
| `createMediaStreamSource()` | (Yes) | 14 | (Yes) | 26 | No | 15 | ? |
| `createMediaStreamDestination()` | (Yes) | 14 | (Yes) | 26 | No | 15 | ? |
| `createMediaStreamTrackSource()` | ? | ? | ? | No | No | ? | No |
| `suspend()` | 43 | 43 | ? | 40 | No | (Yes) | ? |

