SDK quickstart

This page walks you through integrating the Video SDK and building a basic real-time interactive app from scratch with minimal code.

Understand the tech

This section shows you how to integrate Video Calling features into your app. The following are some basic concepts of real-time audio and video interaction:

  • Video SDK: An SDK developed by Agora to enable developers to add real-time audio and video interaction to their projects.
  • Channel: A channel is used to transmit data. Users in the same channel can interact in real time.
  • Host: A host can publish audio and video in a channel. The host can also subscribe to audio and video published by other hosts.
  • Audience: A user assigned the audience role may subscribe to audio and video within the channel. However, they are not authorized to publish audio or video content.

For more information, see Core Concepts.

The following figure shows the basic workflow you implement to integrate this feature into your app.

Video calling workflow

To start a session, implement the following steps in your app (a minimal Java sketch follows this list):

  • Retrieve a token: A token is a computer-generated string that authenticates a user when your app joins a channel. In this guide, you retrieve your token from Agora Console. To see how to create an authentication server for development purposes, see Secure authentication with tokens.

  • Join a channel: Call methods to create and join a channel; apps that pass the same channel name join the same channel.

  • Send and receive video and audio: Each user sends audio and video streams to, and receives streams from, all other users in the channel.
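For orientation, here is a minimal, non-authoritative sketch of how these steps map to Video SDK calls in Java. It assumes it runs inside an Android Activity that defines an IRtcEngineEventHandler named mRtcEventHandler, and quickStartSketch is a hypothetical helper; the complete, runnable version of this flow appears later on this page.

// Hypothetical helper shown for illustration only; see the full sample later on this page
private void quickStartSketch(String appId, String token, String channelName) throws Exception {
    // 1. Create and configure the engine with the App ID from Agora Console
    RtcEngineConfig config = new RtcEngineConfig();
    config.mContext = getBaseContext();
    config.mAppId = appId;
    config.mEventHandler = mRtcEventHandler; // receives onJoinChannelSuccess, onUserJoined, ...
    RtcEngine engine = RtcEngine.create(config);

    // 2. Enable sending and receiving video
    engine.enableVideo();

    // 3. Join the channel with the token and channel name; uid 0 lets the engine assign a user ID
    ChannelMediaOptions options = new ChannelMediaOptions();
    options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER;
    options.channelProfile = Constants.CHANNEL_PROFILE_COMMUNICATION;
    engine.joinChannel(token, channelName, 0, options);
}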

Prerequisites

  • Android Studio 4.2 or higher.
  • Android SDK API Level 21 or higher.
  • Two mobile devices running Android 5.0 or higher.
  • A camera and a microphone.
  • The following information from Agora Console:

    • App ID: A unique string generated by Agora that identifies your project.

    • A temporary token: A dynamic key that authenticates a user when the client joins a channel. Note down the channel name you use to generate the token.

      information
      The maximum validity period of a temporary token is 24 hours.

    Please refer to Agora account management for details.

Set up your project

This section shows you how to set up a new Android project and integrate the Agora Video SDK.

Create a project

  1. Create a new project (Optional)

    1. Open Android Studio and select File > New > New Project....
    2. Select Phone and Tablet > Empty Activity and click Next.
    3. Set the project name and storage path.
    4. Select Java as the language and click Finish to create the Android project.

    Note
    After you create a project, Android Studio automatically starts gradle sync. Ensure that the synchronization is successful before proceeding to the next step.
  2. Add network and device permissions

    Open the /app/src/main/AndroidManifest.xml file and add the following permissions after </application>:


    <!--Required permissions-->
    <uses-permission android:name="android.permission.INTERNET"/>

    <!--Optional permissions-->
    <uses-permission android:name="android.permission.CAMERA"/>
    <uses-permission android:name="android.permission.RECORD_AUDIO"/>
    <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
    <uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
    <uses-permission android:name="android.permission.BLUETOOTH"/>
    <!-- For devices running Android 12 (API level 31) or higher and integrating Agora Video SDK v4.1.0 or lower, you also need to add the following permission -->
    <uses-permission android:name="android.permission.BLUETOOTH_CONNECT"/>
    <!-- For Android 12.0 or higher, the following permissions are also required -->
    <uses-permission android:name="android.permission.READ_PHONE_STATE"/>
    <uses-permission android:name="android.permission.BLUETOOTH_SCAN"/>

  3. Prevent code obfuscation

    Open the /app/proguard-rules.pro file and add the following lines to prevent the Video SDK code from being obfuscated:


    -keep class io.agora.**{*;}
    -dontwarn io.agora.**

Integrate the SDK

Take the following steps to integrate the Video SDK into your project using Maven Central.

  1. Open the settings.gradle file in the project's root directory and add the Maven Central repository, if it doesn't already exist:


    repositories {
        mavenCentral()
    }

    Note

    If your Android project uses dependencyResolutionManagement, the method of adding the Maven Central repository may differ.

  2. Open the /app/build.gradle file and add the Video SDK under dependencies. Check the latest version of the SDK from the release notes and replace x.y.z with the specific version number.


    dependencies {
        // Replace x.y.z with a specific SDK version number. For example, 4.3.2
        implementation 'io.agora.rtc:full-sdk:x.y.z'

        // To integrate the Lite SDK, use the following line instead
        // implementation 'io.agora.rtc:lite-sdk:4.4.1'
    }


Create a user interface

Set up two FrameLayout elements in your project to display local and remote video streams. Use the following code to generate the essential user interface. Paste the code into the /app/src/main/res/layout/activity_main.xml file, replacing the existing content.

Sample code to create the user interface
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Start Video Call!"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <FrameLayout
        android:id="@+id/local_video_view_container"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:background="@android:color/white" />

    <FrameLayout
        android:id="@+id/remote_video_view_container"
        android:layout_width="160dp"
        android:layout_height="160dp"
        android:layout_alignParentEnd="true"
        android:layout_alignParentRight="true"
        android:layout_alignParentTop="true"
        android:layout_marginEnd="16dp"
        android:layout_marginRight="16dp"
        android:layout_marginTop="16dp"
        android:background="@android:color/darker_gray"
        tools:ignore="MissingConstraints" />

</androidx.constraintlayout.widget.ConstraintLayout>

UI design

Implement Video Calling

This section guides you through the implementation of basic real-time audio and video interaction in your app.

The following figure illustrates the essential steps:

The complete sample code below implements the basic process of real-time interaction. Copy it into your project to quickly implement the core functions.

To use the sample code, copy the following lines into the /app/src/main/java/com/example/<projectname>/MainActivity.java file and replace <projectname> in package com.example.<projectname> with the name of your project.

Complete sample code for real-time Video Calling
import android.Manifest;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.view.SurfaceView;
import android.widget.FrameLayout;
import android.widget.Toast;

import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

import io.agora.rtc2.ChannelMediaOptions;
import io.agora.rtc2.Constants;
import io.agora.rtc2.IRtcEngineEventHandler;
import io.agora.rtc2.RtcEngine;
import io.agora.rtc2.RtcEngineConfig;
import io.agora.rtc2.video.VideoCanvas;

public class MainActivity extends AppCompatActivity {
    // Fill in the App ID obtained from the Agora Console
    private String appId = "<Your app ID>";
    // Fill in the channel name
    private String channelName = "<Your channel name>";
    // Fill in the temporary token generated from Agora Console
    private String token = "<Your token>";

    private RtcEngine mRtcEngine;

    private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
        // Callback when successfully joining the channel
        @Override
        public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
            super.onJoinChannelSuccess(channel, uid, elapsed);
            runOnUiThread(() ->
                    Toast.makeText(MainActivity.this, "Join channel success", Toast.LENGTH_SHORT).show());
        }

        // Callback when a remote user or host joins the current channel
        @Override
        public void onUserJoined(int uid, int elapsed) {
            // When a remote user joins the channel, display the remote video stream for the specified uid
            runOnUiThread(() -> setupRemoteVideo(uid));
        }

        // Callback when a remote user or host leaves the current channel
        @Override
        public void onUserOffline(int uid, int reason) {
            super.onUserOffline(uid, reason);
            runOnUiThread(() ->
                    Toast.makeText(MainActivity.this, "User offline: " + uid, Toast.LENGTH_SHORT).show());
        }
    };

    private void initializeAndJoinChannel() {
        try {
            // Create an RtcEngineConfig instance and configure it
            RtcEngineConfig config = new RtcEngineConfig();
            config.mContext = getBaseContext();
            config.mAppId = appId;
            config.mEventHandler = mRtcEventHandler;
            // Create and initialize an RtcEngine instance
            mRtcEngine = RtcEngine.create(config);
        } catch (Exception e) {
            throw new RuntimeException("Check the error.");
        }

        // Enable the video module
        mRtcEngine.enableVideo();

        // Enable local preview
        mRtcEngine.startPreview();

        // Create a SurfaceView object and make it a child object of FrameLayout
        FrameLayout container = findViewById(R.id.local_video_view_container);
        SurfaceView surfaceView = new SurfaceView(getBaseContext());
        container.addView(surfaceView);
        // Pass the SurfaceView object to the SDK and set the local view
        mRtcEngine.setupLocalVideo(new VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_FIT, 0));

        // Create an instance of ChannelMediaOptions and configure it
        ChannelMediaOptions options = new ChannelMediaOptions();
        // Set the user role to BROADCASTER or AUDIENCE according to the scenario
        options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER;
        // In the video calling scenario, set the channel profile to CHANNEL_PROFILE_COMMUNICATION
        options.channelProfile = Constants.CHANNEL_PROFILE_COMMUNICATION;
        // Join the channel using a temporary token and channel name; setting uid to 0 means the engine assigns a user ID automatically
        // The onJoinChannelSuccess callback is triggered upon success
        mRtcEngine.joinChannel(token, channelName, 0, options);
    }

    private void setupRemoteVideo(int uid) {
        FrameLayout container = findViewById(R.id.remote_video_view_container);
        SurfaceView surfaceView = new SurfaceView(getBaseContext());
        surfaceView.setZOrderMediaOverlay(true);
        container.addView(surfaceView);
        // Pass the SurfaceView object to the SDK and set the remote view
        mRtcEngine.setupRemoteVideo(new VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_FIT, uid));
    }

    private static final int PERMISSION_REQ_ID = 22;

    // Obtain recording, camera and other permissions required to implement real-time audio and video interaction
    private String[] getRequiredPermissions() {
        // Determine the permissions required when targetSDKVersion is 31 or above
        if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.S) {
            return new String[]{
                    Manifest.permission.RECORD_AUDIO, // Recording permission
                    Manifest.permission.CAMERA, // Camera permission
                    Manifest.permission.READ_PHONE_STATE, // Permission to read phone status
                    Manifest.permission.BLUETOOTH_CONNECT // Bluetooth connection permission
            };
        } else {
            return new String[]{
                    Manifest.permission.RECORD_AUDIO,
                    Manifest.permission.CAMERA
            };
        }
    }

    private boolean checkPermissions() {
        for (String permission : getRequiredPermissions()) {
            int permissionCheck = ContextCompat.checkSelfPermission(this, permission);
            if (permissionCheck != PackageManager.PERMISSION_GRANTED) {
                return false;
            }
        }
        return true;
    }

    // System permission request callback
    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (checkPermissions()) {
            initializeAndJoinChannel();
        }
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        // If authorized, initialize RtcEngine and join the channel
        if (checkPermissions()) {
            initializeAndJoinChannel();
        } else {
            ActivityCompat.requestPermissions(this, getRequiredPermissions(), PERMISSION_REQ_ID);
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        // Stop local video preview
        mRtcEngine.stopPreview();
        // Leave the channel
        mRtcEngine.leaveChannel();
    }
}
info
For the appId and token variables, replace the placeholders with the values you obtained from Agora Console. Ensure you enter the same channelName you used when generating the temporary token.

Follow the implementation steps to understand the core API calls in the sample code.

Handle permissions

This section describes how to import Android classes and obtain the camera, recording, and other permissions of Android devices.

  1. Import Android classes


    import androidx.appcompat.app.AppCompatActivity;
    import androidx.core.app.ActivityCompat;
    import androidx.core.content.ContextCompat;

    import android.Manifest;
    import android.content.pm.PackageManager;
    import android.view.SurfaceView;
    import android.widget.FrameLayout;

  2. Get Android permissions

    When the app launches, check whether it has been granted the permissions required for real-time interaction. A sketch of checking and requesting these permissions follows the snippet below.


    private static final int PERMISSION_REQ_ID = 22;

    // Obtain recording, camera and other permissions required to implement real-time audio and video interaction
    private String[] getRequiredPermissions(){
        // Determine the permissions required when targetSDKVersion is 31 or above
        if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.S) {
            return new String[]{
                    Manifest.permission.RECORD_AUDIO, // Recording permission
                    Manifest.permission.CAMERA, // Camera permission
                    Manifest.permission.READ_PHONE_STATE, // Permission to read phone status
                    Manifest.permission.BLUETOOTH_CONNECT // Bluetooth connection permission
            };
        } else {
            return new String[]{
                    Manifest.permission.RECORD_AUDIO,
                    Manifest.permission.CAMERA
            };
        }
    }
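    The complete sample at the top of this page also defines a checkPermissions() helper and handles the system permission callback; both are reproduced below for reference, since the step snippets omit them:

    private boolean checkPermissions() {
        for (String permission : getRequiredPermissions()) {
            if (ContextCompat.checkSelfPermission(this, permission)
                    != PackageManager.PERMISSION_GRANTED) {
                return false;
            }
        }
        return true;
    }

    // System permission request callback: once the user grants the permissions,
    // initialize the engine and join the channel
    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (checkPermissions()) {
            initializeAndJoinChannel();
        }
    }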

Import Agora classes

Import the relevant Agora classes and interfaces:


import io.agora.rtc2.Constants;
import io.agora.rtc2.IRtcEngineEventHandler;
import io.agora.rtc2.RtcEngine;
import io.agora.rtc2.RtcEngineConfig;
import io.agora.rtc2.video.VideoCanvas;
import io.agora.rtc2.ChannelMediaOptions;

Declare variables

Pass in the app ID and the temporary token obtained from Agora Console. Use the channel name you filled in when generating the temporary token. You use these values to initialize the engine and join a channel.


// Fill in the app ID from Agora Console
private String appId = "<Your app ID>";
// Fill in the channel name
private String channelName = "<Your channel name>";
// Fill in the temporary token generated from Agora Console
private String token = "<#Your token#>";

Initialize the engine

Call create to initialize an RtcEngine instance.


private RtcEngine mRtcEngine;

private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
    // Monitor remote users in the channel and obtain their uid
    @Override
    public void onUserJoined(int uid, int elapsed) {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                // After obtaining uid, set up the remote video view
                setupRemoteVideo(uid);
            }
        });
    }
};

// Create an RtcEngineConfig instance and configure it
RtcEngineConfig config = new RtcEngineConfig();
config.mContext = getBaseContext();
config.mAppId = appId;
config.mEventHandler = mRtcEventHandler;
// Create and initialize an RtcEngine instance
mRtcEngine = RtcEngine.create(config);

Note

Before initializing the SDK, ensure that the user fully understands and agrees to the relevant privacy policy.

Enable the video module

Follow these steps to enable the video module:

  1. Call enableVideo to enable the video module.
  2. Call startPreview to enable local video preview.
  3. Call setupLocalVideo to initialize the local view and set the local video display properties.

// Enable the video module
mRtcEngine.enableVideo();

// Enable local preview
mRtcEngine.startPreview();

// Create a SurfaceView object and make it a child object of FrameLayout
FrameLayout container = findViewById(R.id.local_video_view_container);
SurfaceView surfaceView = new SurfaceView(getBaseContext());
container.addView(surfaceView);
// Pass the SurfaceView object to the SDK and set the local view
mRtcEngine.setupLocalVideo(new VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_FIT, 0));

Join a channel and publish audio and video streams

Instantiate and configure ChannelMediaOptions and call joinChannel to join a channel. In ChannelMediaOptions, set the channel profile to COMMUNICATION (video calling) and set the user role to BROADCASTER (host) or AUDIENCE (audience).


// Create an instance of ChannelMediaOptions and configure it
ChannelMediaOptions options = new ChannelMediaOptions();
// Set the user role to BROADCASTER or AUDIENCE according to the scenario
options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER;
// In the video calling scenario, set the channel profile to CHANNEL_PROFILE_COMMUNICATION
options.channelProfile = Constants.CHANNEL_PROFILE_COMMUNICATION;

// Use the temporary token to join the channel
// Setting uid to 0 means the engine assigns a user ID automatically; onJoinChannelSuccess is triggered on success
mRtcEngine.joinChannel(token, channelName, 0, options);
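To have a user join as audience instead of host, set the client role to AUDIENCE, in line with the roles described earlier on this page; the audience subscribes to streams but does not publish. A minimal variation of the snippet above:

// Join as audience: subscribe to remote audio and video without publishing
ChannelMediaOptions options = new ChannelMediaOptions();
options.clientRoleType = Constants.CLIENT_ROLE_AUDIENCE;
options.channelProfile = Constants.CHANNEL_PROFILE_COMMUNICATION;
mRtcEngine.joinChannel(token, channelName, 0, options);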

Set up remote video

When a remote user joins the channel, call setupRemoteVideo and pass in the remote user's uid, obtained from the onUserJoined callback, to display the remote video.


private void setupRemoteVideo(int uid) {
    FrameLayout container = findViewById(R.id.remote_video_view_container);
    SurfaceView surfaceView = new SurfaceView(getBaseContext());
    surfaceView.setZOrderMediaOverlay(true);
    container.addView(surfaceView);
    // Pass the SurfaceView object to the SDK and set the remote view
    mRtcEngine.setupRemoteVideo(new VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_FIT, uid));
}

Start and close the app

When a user launches your app, start real-time interaction. When a user closes the app, stop the interaction.

  1. In the onCreate callback, check whether the app has obtained the required permissions. If the permissions have not been granted, request the required permissions from the user. If permissions have been granted, initialize RtcEngine and join a channel.


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        // If authorized, initialize RtcEngine and join the channel
        if (checkPermissions()) {
            initializeAndJoinChannel();
        } else {
            ActivityCompat.requestPermissions(this, getRequiredPermissions(), PERMISSION_REQ_ID);
        }
    }

  2. When a user closes the app or switches it to the background, call stopPreview to stop the video preview, and then call leaveChannel to leave the current channel and release all session-related resources. An optional cleanup of the remote view is sketched after the following snippet.

    @Override
    protected void onDestroy() {
        super.onDestroy();
        // Stop local video preview
        mRtcEngine.stopPreview();
        // Leave the channel
        mRtcEngine.leaveChannel();
    }
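    As an optional refinement that is not part of the original sample, you can also clear the remote view when the remote user leaves the channel, for example from the onUserOffline callback the sample already registers:

    // Optional: remove the remote video view when the remote user leaves the channel
    @Override
    public void onUserOffline(int uid, int reason) {
        super.onUserOffline(uid, reason);
        runOnUiThread(() -> {
            FrameLayout container = findViewById(R.id.remote_video_view_container);
            container.removeAllViews();
        });
    }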

Test the sample code

Take the following steps to test the sample code:

  1. Enable developer options on your Android test device. Turn on USB debugging, connect the Android device to your development machine through a USB cable, and check that your device appears in the Android device options.

  2. In Android Studio, click Sync Project with Gradle Files to resolve project dependencies and update the configuration.

  3. After synchronization is successful, click Run app. Android Studio starts compilation. After a few moments, the app is installed on your Android device.

  4. Launch the app and grant the recording and camera permissions. If you set the user role to host, you see yourself in the local view.
  5. On a second Android device, repeat the previous steps to install and launch the app, then test the following scenarios:
    • If users on both devices join the channel as hosts, they can see and hear each other.
    • If one user joins as host and the other as audience, the host can see themselves in the local video window; the audience can see the host in the remote video window and hear the host.

Reference

This section contains content that completes the information on this page, or points you to documentation that explains other aspects of this product.

Next steps

After implementing the quickstart sample, read the following documents to learn more:

  • To ensure communication security in a test or production environment, best practice is to obtain and use a token from an authentication server. For details, see Secure authentication with tokens. A brief sketch of renewing a token at runtime is shown below.
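For example, when a token is about to expire, the engine fires the onTokenPrivilegeWillExpire callback, and you can fetch a fresh token and pass it to renewToken. The following sketch is illustrative only and would be added to the mRtcEventHandler defined in the sample; the token-server URL and its plain-text response format are assumptions, not part of this quickstart.

// Illustrative sketch: <your-token-server> and its response format are hypothetical
@Override
public void onTokenPrivilegeWillExpire(String token) {
    super.onTokenPrivilegeWillExpire(token);
    new Thread(() -> {
        try {
            // Assumed endpoint that returns a fresh RTC token as plain text
            java.net.URL url = new java.net.URL("https://<your-token-server>/rtc-token?channel=" + channelName);
            try (java.io.BufferedReader reader = new java.io.BufferedReader(
                    new java.io.InputStreamReader(url.openStream()))) {
                String freshToken = reader.readLine();
                // Pass the new token to the engine before the old one expires
                mRtcEngine.renewToken(freshToken);
            }
        } catch (java.io.IOException e) {
            e.printStackTrace();
        }
    }).start();
}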
