This page only provides information about audio capabilities in WebGL. To learn how to use audio in your Unity project, refer to Audio Overview.
Because Unity uses FMOD to manage audio for platforms, Unity WebGL supports limited audio functionality that includes only the basic features. FMOD relies on threads, which WebGL doesn’t support. For this reason, Unity uses an implementation based on the browser’s Web Audio API, which lets the browser handle audio playback and mixing.
Note: Google Chrome’s Autoplay policy prevents audio and video from playing automatically under certain conditions. For example, even if your game is set to play background music soon after it loads, the music won’t start until the user clicks or taps on the page. For more information on how to enable or disable this policy, refer to Google Chrome’s documentation on the Autoplay policy in Chrome.
Unity WebGL supports the following API classes:
Class | WebGL support status |
---|---|
AudioSource | WebGL supports some APIs. Refer to AudioSource for specific support details. |
AudioListener | All APIs supported. |
AudioClip | WebGL supports some APIs. Refer to AudioClip for specific support details. |
AudioMixer | WebGL supports some APIs. Refer to Audio Mixer for specific support details. |
SystemInfo.supportsAudio | The browser provides audio support for WebGL. For this reason, SystemInfo.supportsAudio is always true. |
Microphone | Not supported. |
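Because the Microphone class isn’t available in WebGL builds, a common pattern is to compile any microphone code out of the WebGL player with platform defines. The following is a minimal sketch, not from the Unity Manual; the class name is illustrative.

```csharp
using UnityEngine;

public class WebGLAudioSupportCheck : MonoBehaviour
{
    void Start()
    {
        // On WebGL the browser always provides audio output,
        // so SystemInfo.supportsAudio reports true.
        Debug.Log($"Audio supported: {SystemInfo.supportsAudio}");

#if !UNITY_WEBGL
        // The Microphone class isn't available in WebGL builds,
        // so keep microphone code out of the WebGL player.
        Debug.Log($"Microphone devices: {Microphone.devices.Length}");
#endif
    }
}
```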
The AudioSource API supports basic positional audio playback. Unity WebGL supports the following AudioSource APIs:
Settings | Description |
---|---|
Clip | Determines the audio clip that plays next. |
dopplerLevel | Sets the Doppler scale for the AudioSource. |
ignoreListenerPause | Allows AudioSource to ignore AudioListener.pause and continue to play audio. |
ignoreListenerVolume | Makes the AudioSource ignore the AudioListener volume. |
isPlaying | Returns true if the AudioSource.clip is playing. |
loop | Allows the application to loop the AudioSource.clip . |
maxDistance | Sets the maximum distance at which the AudioSource.clip stops attenuating or becomes inaudible. |
minDistance | Sets the minimum distance at which the AudioSource.clip no longer rises in volume. The sound starts to attenuate beyond the minimum distance. |
mute | Mutes the AudioSource. |
pitch | Sets the pitch of the AudioSource.clip . WebGL only supports positive pitch values. |
playOnAwake | Plays the AudioSource on Awake. |
rolloffMode | Sets the AudioSource attenuation over distance. |
time | Sets the playback position in seconds. |
timeSamples | Sets the playback position in Pulse-code modulation (PCM) samples. |
velocityUpdateMode | Sets whether the AudioSource updates in the fixed or dynamic update loop. |
volume | Sets the volume of the AudioSource (0.0 to 1.0). |
Pause | Pauses the AudioSource.clip . |
Play | Plays the AudioSource.clip . |
PlayDelayed | Plays the AudioSource.clip with a delay you specify in seconds. |
PlayOneShot | Plays an AudioClip and scales the AudioSource volume by volumeScale. |
PlayScheduled | Plays the AudioSource at a time you specify. |
SetScheduledEndTime | Sets a time that a scheduled AudioSource.clip ends. |
SetScheduledStartTime | Sets the time that a scheduled AudioSource.clip starts. |
Stop | Stops playing the AudioSource.clip . |
UnPause | Unpauses a paused AudioSource.clip . |
PlayClipAtPoint | Plays an AudioClip at a given position in world space. |
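As a usage sketch (not from the Unity Manual), the following script exercises only settings and methods listed in the table above. It assumes an AudioSource on the same GameObject and clips assigned in the Inspector; the class and field names are illustrative.

```csharp
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class WebGLAudioSourceExample : MonoBehaviour
{
    // Assign these in the Inspector; the field names are illustrative.
    public AudioClip musicClip;
    public AudioClip oneShotClip;

    void Start()
    {
        var source = GetComponent<AudioSource>();

        source.clip = musicClip;
        source.loop = true;
        source.volume = 0.8f;
        source.pitch = 1.0f;          // WebGL only supports positive pitch values.
        source.rolloffMode = AudioRolloffMode.Linear;
        source.minDistance = 1f;
        source.maxDistance = 25f;

        source.Play();

        // Layer a one-shot on top of the looping clip at half volume.
        source.PlayOneShot(oneShotClip, 0.5f);
    }
}
```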
Unity WebGL imports AudioClip files in the AAC format, which most browsers support. Unity WebGL supports the following AudioClip APIs:
Properties | Description |
---|---|
length | The length of the AudioClip in seconds. |
loadState | Returns the current load state of the audio data associated with an AudioClip. |
samples | The length of the AudioClip in samples. |
loadType | The load type of the clip. You can set the AudioClip load type in the Inspector. |
Method | Description | Additional information |
---|---|---|
AudioClip.Create | Creates an AudioClip with a name and length you specify. | Unity WebGL partially supports AudioClip.Create . Browsers don’t support dynamic streaming, so to use AudioClip.Create , set the stream parameter to false. |
AudioClip.SetData | Sets sample data in an AudioSource.clip. | Unity WebGL partially supports AudioClip.SetData . You can use this method only on compressed audio files with Load Type set to Decompress on Load. Refer to Compressed audio. |
AudioClip.GetData | Retrieves an array with sample data from an AudioSource.clip. | Unity WebGL partially supports AudioClip.GetData . You can use this method only on compressed audio files with Load Type set to Decompress on Load. Refer to Compressed audio. |
Note: For audio clip support on Linux, make sure you’ve installed the ffmpeg package.
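The sketch below (not from the Unity Manual) shows AudioClip.Create with the stream parameter set to false, as the table above requires for WebGL. Filling the created clip with SetData is an assumption here: a procedurally created clip holds decompressed sample data, which matches the Decompress on Load case described above, but treat this as a sketch rather than documented behavior.

```csharp
using UnityEngine;

public class ProceduralClipExample : MonoBehaviour
{
    void Start()
    {
        const int frequency = 44100;
        const int lengthSamples = 44100;   // one second of mono audio

        // Pass stream: false — browsers don't support dynamic streaming on WebGL.
        var clip = AudioClip.Create("tone", lengthSamples, 1, frequency, false);

        // Fill the clip with a 440 Hz sine wave (assumed to work because the
        // clip's sample data is held decompressed in memory).
        var data = new float[lengthSamples];
        for (int i = 0; i < data.Length; i++)
        {
            data[i] = Mathf.Sin(2f * Mathf.PI * 440f * i / frequency);
        }
        clip.SetData(data, 0);

        var source = gameObject.AddComponent<AudioSource>();
        source.clip = clip;
        source.Play();
    }
}
```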
Unity WebGL supports some functionality of Audio Mixer assets.
You can route AudioSources through Audio Mixer groups and adjust group volume on Unity WebGL.
Note: Volume is the only property you can change on Unity WebGL. Other properties and sound effects aren’t supported.
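As a minimal sketch, you can change a group’s volume from a script through an exposed parameter. The parameter name "MusicVolume" below is hypothetical; expose the group’s Volume property in the Audio Mixer window and match the name in your own project.

```csharp
using UnityEngine;
using UnityEngine.Audio;

public class MixerVolumeExample : MonoBehaviour
{
    // Assign an Audio Mixer asset in the Inspector.
    public AudioMixer mixer;

    // "MusicVolume" is a hypothetical exposed parameter name.
    public void SetMusicVolume(float linear01)
    {
        // The mixer works in decibels, so convert a 0-1 slider value to dB.
        float dB = Mathf.Log10(Mathf.Clamp(linear01, 0.0001f, 1f)) * 20f;
        mixer.SetFloat("MusicVolume", dB);
    }
}
```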
To use compressed audio with WebGL in Unity, set the AudioClip loadType to one of the following options:
Compression method | Description | Considerations |
---|---|---|
CompressedInMemory | Use this to compress the audio on disk and have it remain compressed after it loads into your application memory. | Compressed audio can cause latency and is less precise when it comes to audio playback. However, compressed audio uses less memory in your application than decompressed audio. It’s best practice to use CompressedInMemory for audio that’s unaffected by precision (for example, background music). |
DecompressOnLoad | Use this to compress the audio on disk, similar to CompressedInMemory, and decompress it when it loads into your application memory. | Decompressed audio uses a significant amount of memory compared to compressed audio but has lower latency and more audio flexibility. Use DecompressOnLoad for audio that’s affected by precision (for example, character dialog or sound effects). |
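As a minimal sketch (not from the Unity Manual), the following script guards a GetData call by checking the clip’s load type at runtime, since on WebGL sample data access works only with Decompress On Load. The class and field names are illustrative.

```csharp
using UnityEngine;

public class ClipDataAccessExample : MonoBehaviour
{
    public AudioClip clip;   // Import with Load Type set to Decompress On Load.

    void Start()
    {
        // GetData/SetData only work on WebGL when the clip is decompressed in memory.
        if (clip.loadType == AudioClipLoadType.DecompressOnLoad)
        {
            var samples = new float[clip.samples * clip.channels];
            clip.GetData(samples, 0);
            Debug.Log($"Read {samples.Length} samples from '{clip.name}'.");
        }
        else
        {
            Debug.LogWarning("Set the clip's Load Type to Decompress On Load to read sample data.");
        }
    }
}
```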
For security reasons, browsers don’t allow audio playback until an end user interacts with your application webpage via a mouse click, touch event, or key press. Use a loading screen to allow the end user to interact with your application and start audio playback before your main content begins.
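A minimal sketch of this pattern: a start-screen script that waits for the first click, tap, or key press before starting music. The class and field names are illustrative, and the script assumes a looping AudioSource assigned in the Inspector.

```csharp
using UnityEngine;

public class StartAudioAfterInteraction : MonoBehaviour
{
    public AudioSource music;   // Assign in the Inspector.
    bool started;

    void Update()
    {
        // Browsers unlock audio only after a user gesture, so start playback
        // from the first key press, mouse click, or touch on the start screen.
        if (!started &&
            (Input.anyKeyDown || Input.GetMouseButtonDown(0) || Input.touchCount > 0))
        {
            music.Play();
            started = true;
        }
    }
}
```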