Related reference: Scene API / Audio
The current advanced Web API mainly provides control over audio objects that already exist in the scene.
In other words, audio is usually configured in the scene editor first, and then the host page uses the advanced APIs to play, pause, and stop it, query and set its state, and listen for its media events.

There is currently no runtime audio-creation API such as `createAudio()`. If needed, the host page can still create and control its own native `Audio` element separately.
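For example, a minimal host-side helper using the standard browser `Audio` constructor (this is plain browser DOM API, not part of the scene API; the helper name and URL are illustrative):

```js
// Host-page fallback: create a native HTMLAudioElement instead of a scene object.
// `createHostAudio` is a hypothetical helper name; the URL is a placeholder.
function createHostAudio(url, { loop = false, volume = 1.0 } = {}) {
  const el = new Audio(url); // standard browser constructor
  el.loop = loop;
  el.volume = volume;
  return el;
}

// Usage (call play() after a user gesture, to satisfy autoplay policy):
// const sfx = createHostAudio('/assets/click.mp3', { volume: 0.5 });
// sfx.play();
```

Such an element lives entirely on the host page and is not visible to the scene APIs described below.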
The most common way to fetch scene audio is by object name:

```js
const audio = await api.getObject('scene-bgm');
```

If it returns `null`, that usually means the scene has not finished loading yet, or no audio object with that name exists in the scene.
So a safer time to query is usually after `loadSceneEnd` or `sceneStart`:

```js
iframe.addEventListener('loadSceneEnd', async () => {
  const audio = await api.getObject('scene-bgm');
  console.log(audio);
});
```

Once you have the object, start playback with `playAudio()`:

```js
await api.playAudio(audio, {
  loop: false,
});
```

If needed, you can also control looping through `options.loop`.
```js
await api.pauseAudio(audio);
await api.stopAudio(audio);
await api.playbackAudio(audio, { loop: false });
```

Their typical differences are:

- `pauseAudio()`: pauses at the current position
- `stopAudio()`: stops and jumps back to the beginning
- `playbackAudio()`: restarts playback from the beginning

Several getters expose the current audio state:

```js
const duration = await api.getAudioDuration(audio);
const paused = await api.getAudioPaused(audio);
const currentTime = await api.getAudioCurrentTime(audio);
const loop = await api.getAudioLoop(audio);
```

Common uses:

- syncing a play/pause button with `paused`
- building a progress display from `currentTime` and `duration`
- checking the current `loop` setting

Matching setters let you change the playback position and looping:

```js
await api.setAudioCurrentTime(audio, 12.5);
await api.setAudioLoop(audio, true);
```

Useful when you need to jump to a specific timestamp or change the loop behavior at runtime.
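Combining the getters above, a small helper could produce a progress snapshot for host-page UI (a sketch; `getAudioProgress` is a hypothetical name, not part of the API):

```js
// Sketch: read the audio state in one call and derive a percentage.
// `api` is the scene API handle; `audio` comes from api.getObject().
async function getAudioProgress(api, audio) {
  const duration = await api.getAudioDuration(audio);
  const currentTime = await api.getAudioCurrentTime(audio);
  const paused = await api.getAudioPaused(audio);
  return {
    paused,
    currentTime,
    duration,
    // Guard against a zero/unknown duration before dividing.
    percent: duration > 0 ? Math.round((currentTime / duration) * 100) : 0,
  };
}

// Example: seek to the halfway point.
// const { duration } = await getAudioProgress(api, audio);
// await api.setAudioCurrentTime(audio, duration / 2);
```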
Audio objects support media events similar to video:

- `play` - fires when playback starts
- `pause` - fires when playback pauses
- `ended` - fires when playback finishes naturally

Example:

```js
function onPlay(event) {
  // https://developer.mozilla.org/en-US/docs/Web/API/Event
}

function onPause(event) {
  // https://developer.mozilla.org/en-US/docs/Web/API/Event
}

function onEnded(event) {
  // https://developer.mozilla.org/en-US/docs/Web/API/Event
}

await api.on('play', onPlay, audio);
await api.on('pause', onPause, audio);
await api.on('ended', onEnded, audio);
```

A common pattern is:

- fetch the audio object with `getObject()`
- call `playAudio()` after the user clicks "Start Now" or another host-layer button
- drive `pauseAudio()` / `playbackAudio()` with host-layer buttons

Useful for background music, narration, and other audio that should respond to host-page controls.
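The steps above might be wired up like this (a sketch; the helper names and host-page buttons are hypothetical, and `api` is assumed to be the connected scene API handle):

```js
// Sketch of host-layer wiring: fetch the object once, then drive playback
// from host-page buttons.
let audio = null;

async function initAudio(api) {
  // Query after the scene is ready so getObject() does not return null.
  audio = await api.getObject('scene-bgm');
}

async function togglePlayback(api) {
  if (!audio) return;
  const paused = await api.getAudioPaused(audio);
  if (paused) {
    await api.playAudio(audio, { loop: true });
  } else {
    await api.pauseAudio(audio);
  }
  return paused ? 'playing' : 'paused';
}

// Example wiring (hypothetical host-page elements):
// iframe.addEventListener('loadSceneEnd', () => initAudio(api));
// startButton.addEventListener('click', () => api.playAudio(audio, { loop: true }));
// toggleButton.addEventListener('click', () => togglePlayback(api));
```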
Current audio support is mainly designed for controlling existing scene audio objects, not creating new audio dynamically from the host page.
Audio is not a normal 3D object, so it cannot participate in hierarchy operations such as `addChild()` the way models, images, and videos do.
Even though the plugin includes some compatibility handling, audio playback is still affected by the browser autoplay policy. Pay special attention to the fact that most browsers block audible playback until the user has interacted with the page.
So if your project depends on background music or important audio prompts, it is usually best to start playback from an explicit user gesture, such as a "Start" button, rather than relying on autoplay.
If you plan to control audio from the host page, give audio objects stable and readable names in the scene editor, for example:

- `scene-bgm`
- `button-click-sfx`
- `result-voiceover`

That makes later `getObject()` lookups much more reliable.