Use this guide to quickly start live interactive video streaming with the Agora Video SDK for Android.

Sample project

We provide an open-source OpenLive-Android demo project on GitHub that implements basic live interactive video streaming.

Prerequisites


  • Android Studio 3.0 or later
  • Android SDK API Level 16 or higher
  • A mobile device running Android 4.1 or later
  • A valid Agora account and an App ID
Open the specified ports in Firewall Requirements if your network has a firewall.

Set up the development environment

In this section, we will create an Android project, integrate the Agora Video SDK into the project, and add the Android device permissions to prepare the development environment.

Create an Android project

Now, let's build a project from scratch. Skip to Integrate the SDK if a project already exists.

  1. Open Android Studio and click Start a new Android Studio project.

  2. On the Select a Project Template panel, choose Phone and Tablet > Empty Activity, and click Next.

  3. On the Configure Your Project panel, fill in the following contents:

    • Name: The name of your project, for example, HelloAgora
    • Package name: The name of the project package, for example, io.agora.helloagora
    • Save location: The path to save the project
    • Language: The programming language of the project, for example, Java
    • Minimum API level: The minimum API level of the project

  4. Click Finish. Follow the on-screen instructions, if any, to install the plug-ins.

The above steps use Android Studio 3.6.2 as an example. To create a project, you can also refer to the official Android Studio user guide, Build your first app.

Integrate the SDK

Choose either of the following methods to integrate the Agora Video SDK into your project.

Method 1: Automatically integrate the SDK with JCenter

Add the following line in the /app/build.gradle file of your project:

dependencies {
    // For x.y.z, fill in a specific SDK version number, such as 3.0.0.
    // Get the latest version number through the release notes.
    implementation 'io.agora.rtc:full-sdk:x.y.z'
}

Click Release notes to get the latest version number.

Method 2: Manually copy the SDK files

  1. Go to SDK Downloads, download the latest version of the Agora Video SDK, and unzip the downloaded SDK package.
  2. Copy the following files or subfolders from the libs folder of the downloaded SDK package to the path of your project.
File or subfolder        Path of your project
agora-rtc-sdk.jar file   /app/libs/
arm64-v8a folder         /app/src/main/jniLibs/
armeabi-v7a folder       /app/src/main/jniLibs/
x86 folder               /app/src/main/jniLibs/
x86_64 folder            /app/src/main/jniLibs/
  • If your project does not use the encryption function, we recommend deleting the encryption-related file in the SDK package.
  • If you use the armeabi architecture, copy the files from the armeabi-v7a folder to the armeabi folder of your project. Contact Agora technical support if you encounter any incompatibility issues.

Add project permissions

Add the following permissions in the /app/src/main/AndroidManifest.xml file for device access according to your needs:

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
   package="io.agora.helloagora">

   <uses-permission android:name="android.permission.READ_PHONE_STATE" />
   <uses-permission android:name="android.permission.INTERNET" />
   <uses-permission android:name="android.permission.RECORD_AUDIO" />
   <uses-permission android:name="android.permission.CAMERA" />
   <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
   <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
   <uses-permission android:name="android.permission.BLUETOOTH" />
   <uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
   <!-- Add the following permission if your scenario involves reading the external storage: -->
   <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
   <!-- For devices running Android 10.0 or later, you also need to add the following permission: -->
   <uses-permission android:name="android.permission.READ_PRIVILEGED_PHONE_STATE" />
</manifest>


If your targetSdkVersion ≥ 29, add the following attribute to the <application> element in the AndroidManifest.xml file:

   android:requestLegacyExternalStorage="true"

Prevent code obfuscation

Add the following line in the app/proguard-rules.pro file to prevent obfuscating the code of the Agora SDK:

-keep class io.agora.**{*;}

Implement the basic live interactive streaming

This section introduces how to use the Agora Video SDK to start the live interactive video streaming. The following figure shows the API call sequence of the live interactive video streaming.

1. Create the UI

Create the user interface (UI) for the interactive streaming in the layout file of your project. Skip to Import Classes if you already have a UI in your project.

If you are implementing the video streaming, we recommend adding the following elements into the UI:

  • The view of the host
  • The exit button

You can also refer to the xml files under the layout path in the OpenLive-Android demo project.

Example for creating the UI
<?xml version="1.0" encoding="UTF-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- The view of the host -->
    <FrameLayout
        android:id="@+id/video_view_container"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <ImageView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_centerInParent="true"
        android:src="@drawable/icon_agora_large" />

    <!-- The exit button -->
    <Button
        android:id="@+id/btn_exit"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:text="Leave" />
</RelativeLayout>
2. Import Classes

Import the following classes in the activity file of your project:

import io.agora.rtc.IRtcEngineEventHandler;
import io.agora.rtc.RtcEngine;


3. Get the device permission

Call the checkSelfPermission method to access the camera and the microphone of the Android device when launching the activity.

private static final int PERMISSION_REQ_ID = 22;

// Ask for Android device permissions at runtime.
private static final String[] REQUESTED_PERMISSIONS = {
        Manifest.permission.RECORD_AUDIO,
        Manifest.permission.CAMERA,
        Manifest.permission.WRITE_EXTERNAL_STORAGE
};

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    // If all the permissions are granted, initialize the RtcEngine object and join a channel.
    if (checkSelfPermission(REQUESTED_PERMISSIONS[0], PERMISSION_REQ_ID) &&
            checkSelfPermission(REQUESTED_PERMISSIONS[1], PERMISSION_REQ_ID) &&
            checkSelfPermission(REQUESTED_PERMISSIONS[2], PERMISSION_REQ_ID)) {
        // Initialize the RtcEngine object and join a channel here.
    }
}

private boolean checkSelfPermission(String permission, int requestCode) {
    if (ContextCompat.checkSelfPermission(this, permission) !=
            PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this, REQUESTED_PERMISSIONS, requestCode);
        return false;
    }
    return true;
}
4. Initialize RtcEngine

Create and initialize the RtcEngine object before calling any other Agora APIs.

In the strings.xml file, set the value of agora_app_id to your App ID. Call the create method and pass in the App ID to initialize the RtcEngine object.

You can also listen for callback events, such as when the local user joins the channel, and when the first video frame of a host is decoded. Do not implement UI operations in these callbacks.

private static final String TAG = "MainActivity";

private RtcEngine mRtcEngine;
private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
    // Listen for the onJoinChannelSuccess callback.
    // This callback occurs when the local user successfully joins the channel.
    @Override
    public void onJoinChannelSuccess(String channel, final int uid, int elapsed) {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                Log.i("agora", "Join channel success, uid: " + (uid & 0xFFFFFFFFL));
            }
        });
    }

    // Listen for the onFirstRemoteVideoDecoded callback.
    // This callback occurs when the first video frame of the host is received and decoded after the host successfully joins the channel.
    // You can call the setupRemoteVideo method in this callback to set up the remote video view.
    @Override
    public void onFirstRemoteVideoDecoded(final int uid, int width, int height, int elapsed) {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                Log.i("agora", "First remote video decoded, uid: " + (uid & 0xFFFFFFFFL));
                setupRemoteVideo(uid);
            }
        });
    }

    // Listen for the onUserOffline callback.
    // This callback occurs when the host leaves the channel or drops offline.
    @Override
    public void onUserOffline(final int uid, int reason) {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                Log.i("agora", "User offline, uid: " + (uid & 0xFFFFFFFFL));
            }
        });
    }
};

// Initialize the RtcEngine object.
private void initializeEngine() {
    try {
        mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
    } catch (Exception e) {
        Log.e(TAG, Log.getStackTraceString(e));
        throw new RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e));
    }
}
5. Set the channel profile

After initializing the RtcEngine object, call the setChannelProfile method to set the channel profile as LIVE_BROADCASTING.

One RtcEngine object uses one profile only. If you want to switch to another profile, release the current RtcEngine object with the destroy method and create a new one before calling the setChannelProfile method.

private void setChannelProfile() {
    mRtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
}

6. Set the user role

A live-streaming channel has two user roles: BROADCASTER and AUDIENCE, and the default role is AUDIENCE. After setting the channel profile to LIVE_BROADCASTING, your app may use the following steps to set the client role:

  1. Allow the user to set the role as BROADCASTER or AUDIENCE.
  2. Call the setClientRole method and pass in the client role set by the user.

Note that in the live interactive streaming, only the host can be heard and seen. If you want to switch the user role after joining the channel, call the setClientRole method.

public void onClickJoin(View view) {
    // Show a dialog box to choose a user role.
    AlertDialog.Builder builder = new AlertDialog.Builder(this);
    builder.setNegativeButton(R.string.label_audience, new DialogInterface.OnClickListener() {
        @Override
        public void onClick(DialogInterface dialog, int which) {
            forwardToLiveRoom(Constants.CLIENT_ROLE_AUDIENCE);
        }
    });
    builder.setPositiveButton(R.string.label_broadcaster, new DialogInterface.OnClickListener() {
        @Override
        public void onClick(DialogInterface dialog, int which) {
            forwardToLiveRoom(Constants.CLIENT_ROLE_BROADCASTER);
        }
    });
    AlertDialog dialog = builder.create();
    dialog.show();
}

// Get the user role and channel name specified by the user.
// The channel name is used when joining the channel.
public void forwardToLiveRoom(int cRole) {
    // R.id.room_name refers to the EditText for the channel name in your layout.
    final EditText v_room = (EditText) findViewById(R.id.room_name);
    String room = v_room.getText().toString();

    Intent i = new Intent(MainActivity.this, LiveRoomActivity.class);
    i.putExtra("CRole", cRole);
    i.putExtra("CName", room);
    startActivity(i);
}

// In the live room activity, get the user role passed by the intent.
private int mRole;
mRole = getIntent().getIntExtra("CRole", 0);

// Pass in the user role set by the user.
private void setClientRole() {
    mRtcEngine.setClientRole(mRole);
}

7. Set the local video view

After setting the channel profile and user role, set the local video view before joining the channel so that the host can see the local video in the live streaming. Follow these steps to configure the local video view:

  1. Call the enableVideo method to enable the video module.
  2. Call the CreateRendererView method to create a SurfaceView object.
  3. Call the setupLocalVideo method to configure the local video display settings.
private FrameLayout mLocalContainer;
private SurfaceView mLocalView;

private void setupLocalVideo() {
    // Enable the video module.
    mRtcEngine.enableVideo();

    // Create a SurfaceView object.
    mLocalView = RtcEngine.CreateRendererView(getBaseContext());
    mLocalView.setZOrderMediaOverlay(true);
    mLocalContainer.addView(mLocalView);

    // Set the local video view.
    VideoCanvas localVideoCanvas = new VideoCanvas(mLocalView, VideoCanvas.RENDER_MODE_HIDDEN, 0);
    mRtcEngine.setupLocalVideo(localVideoCanvas);
}

8. Join a channel

After setting the user role and the local video view (for the video live streaming), you can call the joinChannel method to join a channel. In this method, set the following parameters:

  • token: Pass a token that identifies the role and privilege of the user. You can set it as one of the following values:

    • NULL.
    • A temporary token generated in the Agora Console. A temporary token is valid for 24 hours. For details, see Get a Temporary Token.
    • A token generated at the server. This applies to scenarios with high-security requirements. For details, see Generate a token from Your Server.
    If your project has enabled the app certificate, ensure that you provide a token.
  • channelName: Specify the channel name that you want to join.

  • uid: ID of the local user that is an integer and should be unique. If you set uid as 0, the SDK assigns a user ID for the local user and returns it in the onJoinChannelSuccess callback.

For more details on the parameter settings, see joinChannel.
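A note on the uid masking that appears in the log statements in this guide: Agora user IDs are 32-bit unsigned values, but Java ints are signed, so a large uid can appear negative when logged directly. Masking with 0xFFFFFFFFL recovers the unsigned value. The following self-contained sketch illustrates the conversion; the helper name toUnsignedUid is our own, not an SDK API:

```java
public class UidDemo {
    // Hypothetical helper (not an SDK method): converts the signed int uid
    // reported by the Java SDK into the unsigned 32-bit value Agora assigned.
    static long toUnsignedUid(int uid) {
        return uid & 0xFFFFFFFFL;
    }

    public static void main(String[] args) {
        // A uid above 2^31 - 1 appears negative as a Java int.
        System.out.println(toUnsignedUid(-1));     // prints 4294967295
        System.out.println(toUnsignedUid(123456)); // prints 123456
    }
}
```

This is why the callbacks above log `uid & 0xFFFFFFFFL` rather than the raw int.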

If a live-streaming channel uses both the RTC Native SDK and the RTC Web SDK, ensure that you call the enableWebSdkInteroperability method before joining the channel.

private String mRoomName;

private void joinChannel() {
    // For SDKs earlier than v3.0.0, call this method to enable interoperability between the Native SDK and the Web SDK if the Web SDK is in the channel. As of v3.0.0, the Native SDK enables the interoperability with the Web SDK by default.
    // mRtcEngine.enableWebSdkInteroperability(true);

    // Join a channel with a token. Replace YOUR_TOKEN with your token, or null if no token is used.
    mRoomName = getIntent().getStringExtra("CName");
    mRtcEngine.joinChannel(YOUR_TOKEN, mRoomName, "Extra Optional Data", 0);
}

9. Set the remote video view

In video streaming, you should also be able to see all the hosts, regardless of your role. This is achieved by calling the setupRemoteVideo method after joining the channel. The difference between setting the local and remote views is that you need to specify the ID of the remote user when setting up the remote view.

Shortly after a remote host joins the channel, the SDK gets the host's user ID in the onFirstRemoteVideoDecoded callback. Call the setupRemoteVideo method in the callback and pass in the uid to set the video view of the host.

    // Listen for the onFirstRemoteVideoDecoded callback.
    // This callback occurs when the first video frame of the host is received and decoded after the host successfully joins the channel.
    @Override
    public void onFirstRemoteVideoDecoded(final int uid, int width, int height, int elapsed) {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                Log.i("agora", "First remote video decoded, uid: " + (uid & 0xFFFFFFFFL));
                setupRemoteVideo(uid);
            }
        });
    }

private RelativeLayout mRemoteContainer;
private SurfaceView mRemoteView;

private void setupRemoteVideo(int uid) {
    // Create a SurfaceView object.
    mRemoteView = RtcEngine.CreateRendererView(getBaseContext());
    mRemoteContainer.addView(mRemoteView);

    // Set the remote video view.
    mRtcEngine.setupRemoteVideo(new VideoCanvas(mRemoteView, VideoCanvas.RENDER_MODE_HIDDEN, uid));
}


10. Additional steps

You can implement more advanced features and functionalities in the live interactive streaming.

Mute the local audio

Call the muteLocalAudioStream method to stop or resume sending the local audio stream to mute or unmute the local user.

private boolean mMuted;

public void onLocalAudioMuteClicked(View view) {
    mMuted = !mMuted;
    mRtcEngine.muteLocalAudioStream(mMuted);
}

Switch the camera direction

Call the switchCamera method to switch the direction of the camera.

public void onSwitchCameraClicked(View view) {
    mRtcEngine.switchCamera();
}

11. Leave the channel

Call the leaveChannel method to leave the current channel according to your scenario, for example, when the live interactive streaming ends, when you need to close the app, or when your app runs in the background.

protected void onDestroy() {
    super.onDestroy();
    if (!mCallEnd) {
        leaveChannel();
    }
    RtcEngine.destroy();
}

private void leaveChannel() {
    // Leave the current channel.
    mRtcEngine.leaveChannel();
}

Sample code

You can find the complete code logic in the OpenLive-Android demo project.

Run the project

Run the project on your Android device. When you set the role as the host and successfully start the live interactive streaming, you can see the video view of yourself in the app. When you set the role as the audience and successfully join the live interactive streaming, you can see the video view of the host in the app.