---
title: AV
description: A universal library that provides separate APIs for Audio and Video playback.
sourceCodeUrl:
packageName: expo-av
iconUrl: /static/images/packages/expo-av.png
platforms:
---
import { APIInstallSection } from '/components/plugins/InstallSection';
import APISection from '/components/plugins/APISection';
import {
ConfigReactNative,
ConfigPluginExample,
ConfigPluginProperties,
} from '/components/plugins/ConfigSection';
import { AndroidPermissions, IOSPermissions } from '/components/plugins/permissions';
The `Audio.Sound` objects and `Video` components share a unified imperative API for media playback.

Note that for `Video`, all of the operations are also available via props on the component. However, we recommend using this imperative playback API for most applications where finer control over the state of the video playback is needed.

Try the playlist example app (source code is on GitHub) to see an example usage of the playback API for both `Audio.Sound` and `Video`.
> **info** Audio recording APIs are not available on tvOS (Apple TV).
You can configure `expo-av` using its built-in config plugin if you use config plugins in your project (EAS Build or `npx expo run:[android|ios]`). The plugin allows you to configure various properties that cannot be set at runtime and require building a new app binary to take effect.
```json
{
  "expo": {
    "plugins": [
      [
        "expo-av",
        {
          "microphonePermission": "Allow $(PRODUCT_NAME) to access your microphone."
        }
      ]
    ]
  }
}
```
<ConfigPluginProperties
  properties={[
    {
      name: 'microphonePermission',
      platform: 'ios',
      description: 'A string to set the `NSMicrophoneUsageDescription` permission message.',
      default: '"Allow $(PRODUCT_NAME) to access your microphone"',
    },
  ]}
/>
Learn how to configure the native projects in the installation instructions in the `expo-av` repository.
On this page, we reference operations on `playbackObject`s. Here is an example of obtaining access to the reference for both sound and video:
```js
await Audio.setAudioModeAsync({ playsInSilentModeIOS: true });

const playbackObject = new Audio.Sound();
// OR
const { sound: playbackObject } = await Audio.Sound.createAsync(
  { uri: 'http://foo/bar.mp3' },
  { shouldPlay: true }
);
```
See the audio documentation for further information on `Audio.Sound.createAsync()`.
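To illustrate the shared imperative API, here is a short sketch of driving a loaded sound with the standard playback methods (the URI is a placeholder, and the function name is ours):

```js
import { Audio } from 'expo-av';

async function demoPlayback() {
  // Load a sound; the URI below is a placeholder.
  const { sound } = await Audio.Sound.createAsync({ uri: 'http://foo/bar.mp3' });

  await sound.playAsync(); // start playback
  await sound.pauseAsync(); // pause, keeping the media loaded
  await sound.setPositionAsync(0); // rewind to the beginning
  await sound.unloadAsync(); // release the media from memory
}
```

The same methods are available on the reference to a `Video` component, since both share the playback API.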
```jsx
/* @hide ... */ /* @end */
_handleVideoRef = component => {
  const playbackObject = component;
  // ...
};
/* @hide ... */ /* @end */
render() {
  return (
    <Video
      ref={this._handleVideoRef}
    />
    /* @hide ... */ /* @end */
  );
}
```
See the video documentation for further information.
```js
_onPlaybackStatusUpdate = playbackStatus => {
  if (!playbackStatus.isLoaded) {
    // Update your UI for the unloaded state
    if (playbackStatus.error) {
      console.log(`Encountered a fatal error during playback: ${playbackStatus.error}`);
      // Send Expo team the error on Slack or the forums so we can help you debug!
    }
  } else {
    // Update your UI for the loaded state
    if (playbackStatus.isPlaying) {
      // Update your UI for the playing state
    } else {
      // Update your UI for the paused state
    }

    if (playbackStatus.isBuffering) {
      // Update your UI for the buffering state
    }

    if (playbackStatus.didJustFinish && !playbackStatus.isLooping) {
      // The player has just finished playing and will stop. Maybe you want to play something else?
    }
    /* @hide ... */ /* @end */
  }
};

// Load the playbackObject and obtain the reference.
playbackObject.setOnPlaybackStatusUpdate(this._onPlaybackStatusUpdate);
```
```js
const N = 20;
/* @hide ... */ /* @end */
_onPlaybackStatusUpdate = playbackStatus => {
  if (playbackStatus.didJustFinish) {
    if (this.state.numberOfLoops === N - 1) {
      playbackObject.setIsLooping(false);
    }
    this.setState({ numberOfLoops: this.state.numberOfLoops + 1 });
  }
};
/* @hide ... */ /* @end */
this.setState({ numberOfLoops: 0 });
// Load the playbackObject and obtain the reference.
playbackObject.setOnPlaybackStatusUpdate(this._onPlaybackStatusUpdate);
playbackObject.setIsLooping(true);
```
When asked to seek an A/V item, the native player on iOS may sometimes seek to a slightly different time. This technique, mentioned in Apple's documentation, is used to shorten the time of the `seekTo` call (the player may decide to play immediately from a different time than requested, instead of decoding the exact requested part and playing it with the decoding delay).

If precision is important, you can specify the tolerance with which the player will seek. However, this will result in an increased delay.
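As an illustrative sketch, `setPositionAsync` accepts an optional tolerances object (the position and tolerance values below are arbitrary examples):

```js
// Seek to the 2-minute mark as precisely as possible: zero tolerance
// before and after the target. This may incur additional decoding delay
// before playback resumes.
await playbackObject.setPositionAsync(120000, {
  toleranceMillisBefore: 0,
  toleranceMillisAfter: 0,
});
```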
```js
import { Audio, Video } from 'expo-av';
```
You must add the following permissions to your **app.json** inside the `expo.android.permissions` array.
<AndroidPermissions permissions={['RECORD_AUDIO']} />
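For example, an **app.json** entry adding the permission above could look like this:

```json
{
  "expo": {
    "android": {
      "permissions": ["RECORD_AUDIO"]
    }
  }
}
```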
The following usage description keys are used by this library:
<IOSPermissions permissions={['NSMicrophoneUsageDescription']} />