SDK quickstart

Real-time video immerses people in the sights and sounds of human connections, keeping them engaged in your app longer.

Thanks to Agora’s intelligent and global Software Defined Real-time Network (Agora SD-RTN™), you can rely on the highest available video and audio quality.

This page shows the minimum code you need to integrate high-quality, low-latency Interactive Live Streaming features into your app using Video SDK.

Understand the tech

This section explains how to integrate Interactive Live Streaming features into your app. The following figure shows the required workflow.

To start a session, implement the following steps in your app:

  • Retrieve a token: A token is a computer-generated string that authenticates a user when your app joins a channel. In this guide you retrieve your token from Agora Console. To see how to create an authentication server for development purposes, see Implement the authentication workflow. To develop your own token generator and integrate it into your production IAM system, read Token generators.

  • Join a channel: Call methods to create and join a channel; apps that pass the same channel name join the same channel.

Prerequisites

To follow this procedure, you must have:

  • Android Studio 4.1 or higher.
  • Android SDK API Level 24 or higher.
  • A mobile device that runs Android 4.1 or higher.
  • An Agora account and project.

  • A computer with Internet access.

    Ensure that no firewall is blocking your network communication.

Project setup

To integrate Interactive Live Streaming into your app, do the following:

  1. In Android Studio, create a new Phone and Tablet, Java Android project with an Empty Activity.

    After creating the project, Android Studio automatically starts a Gradle sync. Ensure that the sync succeeds before you continue.

  2. Integrate the Video SDK into your Android project. To do this:

    1. In /Gradle Scripts/build.gradle (Module: <project name>.app), add the following line under dependencies:

       ```groovy
       dependencies {
           ...
           implementation 'io.agora.rtc:<artifact id>:<version>'
           ...
       }
       ```

    2. Replace <artifact id> and <version> with appropriate values for the latest release. For example, io.agora.rtc:full-sdk:4.0.1.

       You can obtain the latest <artifact id> and <version> information using Maven Central Repository Search.

  3. Add permissions for network and device access.

    In /app/Manifests/AndroidManifest.xml, add the following permissions after </application>:


    ```xml
    <uses-permission android:name="android.permission.READ_PHONE_STATE"/>
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
    <uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />

    <!-- The Agora SDK requires Bluetooth permissions in case users are using Bluetooth devices. -->
    <uses-permission android:name="android.permission.BLUETOOTH" />
    ```

  4. To prevent obfuscating the code in Video SDK, add the following line to /Gradle Scripts/proguard-rules.pro:


    ```
    -keep class io.agora.**{*;}
    ```

You are ready to add Interactive Live Streaming features to your app.

Implement a client for Interactive Live Streaming

When a user opens the app, you initialize Agora Engine. When the user taps a button, the app joins or leaves a channel. When another user joins the same channel, their video and audio is rendered in the app. This simple workflow enables you to concentrate on implementing Agora features and not UX bells and whistles.

This section shows how to use the Video SDK to implement Interactive Live Streaming into your app, step-by-step.

Implement the user interface

In the interface, create frames for local and remote video, and the UI elements to join and leave a channel. In /app/res/layout/activity_main.xml, replace the contents of the file with the following:
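The layout file itself is missing from this extract. As a minimal sketch, an activity_main.xml along the following lines would work; the FrameLayout ids match those referenced by the code later in this guide, while the button placement and the onClick handler names (joinChannel, leaveChannel) are illustrative assumptions:

```xml
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- Join and Leave buttons; the onClick handler names are assumed. -->
    <Button
        android:id="@+id/JoinButton"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:onClick="joinChannel"
        android:text="Join" />

    <Button
        android:id="@+id/LeaveButton"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_toEndOf="@id/JoinButton"
        android:onClick="leaveChannel"
        android:text="Leave" />

    <!-- Container for the local video preview. -->
    <FrameLayout
        android:id="@+id/local_video_view_container"
        android:layout_width="match_parent"
        android:layout_height="200dp"
        android:layout_below="@id/JoinButton" />

    <!-- Container for the remote video stream. -->
    <FrameLayout
        android:id="@+id/remote_video_view_container"
        android:layout_width="match_parent"
        android:layout_height="200dp"
        android:layout_below="@id/local_video_view_container" />
</RelativeLayout>
```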

You see errors in your IDE because this layout refers to methods that you create later.

Handle the system logic

Import the necessary Android classes and handle the Android permissions.

  1. Import the Android classes

    In /app/java/com.example.<projectname>/MainActivity, add the following lines after package com.example.<project name>:
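    The import list itself is missing from this extract. As a sketch, the following imports cover the Android and AndroidX classes used by the snippets later in this guide (assuming an AndroidX project):

    ```java
    import androidx.appcompat.app.AppCompatActivity;
    import androidx.core.app.ActivityCompat;
    import androidx.core.content.ContextCompat;

    import android.Manifest;
    import android.content.pm.PackageManager;
    import android.os.Bundle;
    import android.view.SurfaceView;
    import android.view.View;
    import android.widget.FrameLayout;
    import android.widget.Toast;
    ```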

  2. Handle Android permissions

    When your app launches, ensure that the permissions necessary for real-time video are granted. If the permissions are not granted, use the built-in Android feature to request them; if they are granted, return true.

    In /app/java/com.example.<projectname>/MainActivity, add the following lines before onCreate:


    ```java
    private static final int PERMISSION_REQ_ID = 22;
    private static final String[] REQUESTED_PERMISSIONS =
            {
                    Manifest.permission.RECORD_AUDIO,
                    Manifest.permission.CAMERA
            };

    private boolean checkSelfPermission() {
        if (ContextCompat.checkSelfPermission(this, REQUESTED_PERMISSIONS[0]) != PackageManager.PERMISSION_GRANTED ||
                ContextCompat.checkSelfPermission(this, REQUESTED_PERMISSIONS[1]) != PackageManager.PERMISSION_GRANTED) {
            return false;
        }
        return true;
    }
    ```

  3. Show status updates to your users

    In /app/java/com.example.<projectname>/MainActivity, add the following lines before onCreate:


    ```java
    void showMessage(String message) {
        runOnUiThread(() ->
                Toast.makeText(getApplicationContext(), message, Toast.LENGTH_SHORT).show());
    }
    ```

Implement the channel logic

The following figure shows the API call sequence of implementing Interactive Live Streaming.

To implement this logic, take the following steps:

  1. Import the Video SDK classes

    In /app/java/com.example.<projectname>/MainActivity, add the following lines after the last import statement:


    ```java
    import io.agora.rtc2.Constants;
    import io.agora.rtc2.IRtcEngineEventHandler;
    import io.agora.rtc2.RtcEngine;
    import io.agora.rtc2.RtcEngineConfig;
    import io.agora.rtc2.video.VideoCanvas;
    import io.agora.rtc2.ChannelMediaOptions;
    ```

  2. Declare the variables that you use to create and join a channel

    In /app/java/com.example.<projectname>/MainActivity, add the following lines to MainActivity class:
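    The declarations themselves are missing from this extract. A plausible sketch, based on the names used by the code in the following steps (the placeholder values are assumptions you replace with values from Agora Console):

    ```java
    // Fill in the App ID, channel name, and token from Agora Console.
    private final String appId = "<Your app ID>";
    private String channelName = "<Your channel name>";
    private String token = "<Your token>";
    private final int uid = 0; // 0 asks Agora to assign a uid automatically

    private boolean isJoined = false;
    private RtcEngine agoraEngine;
    private SurfaceView localSurfaceView;
    private SurfaceView remoteSurfaceView;
    ```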

  3. Set up Agora Engine

    To implement Interactive Live Streaming, you use Video SDK to create an Agora Engine instance. In /app/java/com.example.<projectname>/MainActivity, add the following code before onCreate.


    ```java
    private void setupVideoSDKEngine() {
        try {
            RtcEngineConfig config = new RtcEngineConfig();
            config.mContext = getBaseContext();
            config.mAppId = appId;
            config.mEventHandler = mRtcEventHandler;
            agoraEngine = RtcEngine.create(config);
            // By default, the video module is disabled; call enableVideo to enable it.
            agoraEngine.enableVideo();
        } catch (Exception e) {
            showMessage(e.toString());
        }
    }
    ```

  4. Handle and respond to Agora Engine events

    In /app/java/com.example.<projectname>/MainActivity, add the following lines after setupVideoSDKEngine:
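    The handler code itself is missing from this extract. A minimal sketch using standard IRtcEngineEventHandler callbacks, assuming an isJoined flag is declared with the class variables (the exact set of callbacks you override is an assumption):

    ```java
    private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
        @Override
        public void onUserJoined(int uid, int elapsed) {
            showMessage("Remote user joined " + uid);
            // Set up and display the remote video view on the UI thread.
            runOnUiThread(() -> setupRemoteVideo(uid));
        }

        @Override
        public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
            isJoined = true;
            showMessage("Joined channel " + channel);
        }

        @Override
        public void onUserOffline(int uid, int reason) {
            showMessage("Remote user offline " + uid);
            runOnUiThread(() -> remoteSurfaceView.setVisibility(View.GONE));
        }
    };
    ```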

  5. Render video from the remote user in the channel

    In Video SDK, the remote preview starts when a remote user subscribes to the channel. When a remote user joins a channel, you display the video stream in the interface. Use mRtcEventHandler to call setupRemoteVideo for remote video rendering. In /app/java/com.example.<projectname>/MainActivity, add the following code after setupVideoSDKEngine:


    ```java
    private void setupRemoteVideo(int uid) {
        FrameLayout container = findViewById(R.id.remote_video_view_container);
        remoteSurfaceView = new SurfaceView(getBaseContext());
        remoteSurfaceView.setZOrderMediaOverlay(true);
        container.addView(remoteSurfaceView);
        agoraEngine.setupRemoteVideo(new VideoCanvas(remoteSurfaceView, VideoCanvas.RENDER_MODE_FIT, uid));
        // Display remoteSurfaceView.
        remoteSurfaceView.setVisibility(View.VISIBLE);
    }
    ```

  6. Render video from the local user in the channel

    In /app/java/com.example.<projectname>/MainActivity, add the following lines after setupRemoteVideo:


    ```java
    private void setupLocalVideo() {
        FrameLayout container = findViewById(R.id.local_video_view_container);
        // Create a SurfaceView object and add it as a child to the FrameLayout.
        localSurfaceView = new SurfaceView(getBaseContext());
        container.addView(localSurfaceView);
        // Pass the SurfaceView object to Agora so that it renders the local video.
        agoraEngine.setupLocalVideo(new VideoCanvas(localSurfaceView, VideoCanvas.RENDER_MODE_HIDDEN, 0));
    }
    ```

  7. Join the channel to start Interactive Live Streaming

    When the local user clicks the Join button, call joinChannel. This method securely connects the local user to a channel using the authentication token. In /app/java/com.example.<projectname>/MainActivity, add the following code after setupLocalVideo:
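    The joinChannel code itself is missing from this extract. A sketch, assuming the button's onClick handler is named joinChannel and the channel variables from the earlier steps; the live-streaming channel profile and broadcaster role shown here are the usual choices for this scenario, but confirm them against your use case:

    ```java
    public void joinChannel(View view) {
        if (checkSelfPermission()) {
            ChannelMediaOptions options = new ChannelMediaOptions();
            // Live-streaming profile; publish as a host.
            options.channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING;
            options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER;
            // Display the local video preview.
            setupLocalVideo();
            localSurfaceView.setVisibility(View.VISIBLE);
            agoraEngine.startPreview();
            // Join the channel using the token retrieved from Agora Console.
            agoraEngine.joinChannel(token, channelName, uid, options);
        } else {
            showMessage("Permissions were not granted");
        }
    }
    ```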

  8. Leave the channel when the user ends the call

    When your app is running, the user can leave or join a channel using the buttons available in the UI. When a user clicks Leave, use leaveChannel to exit the channel. In /app/java/com.example.<projectname>/MainActivity, add leaveChannel after joinChannel:
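    The leaveChannel code itself is missing from this extract. A sketch, assuming the button's onClick handler is named leaveChannel and an isJoined flag is maintained by the event handler:

    ```java
    public void leaveChannel(View view) {
        if (!isJoined) {
            showMessage("Join a channel first");
        } else {
            agoraEngine.leaveChannel();
            showMessage("You left the channel");
            // Hide the local and remote video views.
            if (remoteSurfaceView != null) remoteSurfaceView.setVisibility(View.GONE);
            if (localSurfaceView != null) localSurfaceView.setVisibility(View.GONE);
            isJoined = false;
        }
    }
    ```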

Start and stop your app

In this implementation, you initiate and destroy Agora Engine when the app opens and closes. The local user joins and leaves a channel using the same Agora Engine instance. In order to send video and audio streams to Agora, you need to ensure that the local user gives permission to access the camera and microphone on the local device.

To elegantly start and stop your app:

  1. Check that the app has the correct permissions and initiate Agora Engine

    In /app/java/com.example.<projectname>/MainActivity, replace onCreate with the following code:
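    The onCreate code itself is missing from this extract. A sketch of the replacement, assuming the permission helpers defined earlier in this guide:

    ```java
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // Request camera and microphone permissions if they are not yet granted.
        if (!checkSelfPermission()) {
            ActivityCompat.requestPermissions(this, REQUESTED_PERMISSIONS, PERMISSION_REQ_ID);
        }
        setupVideoSDKEngine();
    }
    ```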

  2. Clean up the resources used by your app

    When the user closes this app, use onDestroy to clean up all the resources you created. In /app/java/com.example.<projectname>/MainActivity, add onDestroy after onCreate:


    ```java
    @Override
    protected void onDestroy() {
        super.onDestroy();
        agoraEngine.stopPreview();
        agoraEngine.leaveChannel();

        // Destroy the engine in a sub-thread to avoid congestion.
        new Thread(() -> {
            RtcEngine.destroy();
            agoraEngine = null;
        }).start();
    }
    ```

Test your implementation

Agora recommends you run this project on a physical mobile device, as some simulators may not support the full features of this project. To ensure that you have implemented Interactive Live Streaming in your app:

  1. Generate a temporary token in Agora Console.

  2. In your browser, navigate to the Agora web demo and update App ID, Channel, and Token with the values for your temporary token, then click Join.

  3. In Android Studio, in app/java/com.example.<projectname>/MainActivity, update appId, channelName, and token with the values for your temporary token.

  4. Connect a physical Android device to your development device.

  5. In Android Studio, click Run app. A moment later, you see the project installed on your device.

    If this is the first time you run the project, you need to grant microphone and camera access to your app.

Reference

This section contains additional information that completes this page, or points you to documentation that explains other aspects of this product.

  • Downloads shows you how to install Video SDK manually.

  • To ensure communication security in a test or production environment, Agora recommends using a token server to generate tokens. See Implement the authentication workflow.
