Introduction

By default, an app uses the internal audio and video modules for capturing and rendering during real-time communication. However, you can also use an external audio or video source and renderer. This page shows how to use the methods provided by the Agora SDK to customize the audio and video source and renderer.

Customizing the audio and video source and renderer mainly applies to the following scenarios:

  • When the audio or video source captured by the internal modules does not meet your needs. For example, you need to process the captured video frames with a preprocessing library for image enhancement.
  • When an app has its own audio or video module and uses a customized source for code reuse.
  • When you want to use a non-camera source, such as recorded screen data.
  • When you need flexible device resource allocation to avoid conflicts with other services.

Implementation

Ensure that you have prepared the development environment. See Integrate the SDK.

Customize the Audio Source

Use the push method to customize the audio source. With this method, the SDK performs no processing on the audio frames, such as noise reduction.

// java
// Enable the external audio source mode.
rtcEngine.setExternalAudioSource(
    true,      // Enable the external audio source.
    44100,     // Sampling rate. Set it to 8000, 16000, 32000, 44100, or 48000 Hz.
    1          // The number of external audio channels. The maximum value is 2.
);

// Continually push the external audio frames.
rtcEngine.pushExternalAudioFrame(
    data,             // The audio data in the format of byte[].
    timestamp         // The timestamp of the audio frame.
);

API Method
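Each call to pushExternalAudioFrame should carry a complete frame of PCM data whose length matches the sampling rate and channel count you registered. As a minimal sketch (the helper class and the 10 ms frame length are assumptions for illustration, not part of the Agora SDK), the byte-array size for one 16-bit PCM frame can be computed as:

```java
public class AudioFrameSize {
    // Bytes needed for one frame of 16-bit PCM audio.
    // sampleRate in Hz, channels 1 or 2, frameMs is the frame length in milliseconds.
    static int frameBytes(int sampleRate, int channels, int frameMs) {
        int bytesPerSample = 2; // 16-bit PCM
        return sampleRate * frameMs / 1000 * channels * bytesPerSample;
    }

    public static void main(String[] args) {
        // 10 ms of mono audio at 44.1 kHz -> 882 bytes
        System.out.println(frameBytes(44100, 1, 10));
    }
}
```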

Customize the Video Source

The Agora SDK provides two methods to customize the video source:

  • MediaIO method (Recommended).
  • Push method. This method skips processing of the video frames and works best for clients with their own frame-optimization capability.

MediaIO Method

Use the IVideoSource interface in MediaIO to customize the video source. This method sends the external video frames to the server; if local preview is enabled, you need to implement local rendering yourself.

// java
IVideoFrameConsumer mConsumer;
boolean mHasStarted;

// Create an IVideoSource instance.
IVideoSource source = new IVideoSource() {
    @Override
    public int getBufferType() {
        // Get the current frame type. 
        // The SDK uses different methods to process different frame types.
        // If you want to switch to another VideoSource type, create another instance.
        // There are three video frame types.
        return BufferType.BYTE_ARRAY;
        // return BufferType.TEXTURE
        // return BufferType.BYTE_BUFFER;
    }

    @Override
    public boolean onInitialize(IVideoFrameConsumer consumer) {
        // The consumer is created by the SDK.
        // Save it during the lifecycle of the video source.
        mConsumer = consumer;
        return true;
    }

    @Override
    public boolean onStart() {
        mHasStarted = true;
        return true;
    }

    @Override
    public void onStop() {
        mHasStarted = false;
    }

    @Override
    public void onDispose() {
        // Release the consumer.
        mConsumer = null;
    }
};

// Set the input video stream to the IVideoSource instance.
rtcEngine.setVideoSource(source);

// After receiving the video frame data, use the consumer to send the data.
// Choose different methods according to the frame type.
// Suppose the current frame type is byte array, i.e., NV21.
if (mHasStarted && mConsumer != null) {
    mConsumer.consumeByteArrayFrame(data, AgoraVideoFrame.NV21, width, height, rotation, timestamp);
}
API Methods
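consumeByteArrayFrame expects the NV21 buffer length to match the frame dimensions: a full-resolution Y plane followed by interleaved VU data at quarter resolution, i.e., width × height × 3/2 bytes. A small sketch (the helper class is hypothetical, not an SDK API):

```java
public class Nv21Size {
    // NV21 layout: width*height luma (Y) bytes, then width*height/2 interleaved VU bytes.
    static int nv21Bytes(int width, int height) {
        return width * height * 3 / 2;
    }

    public static void main(String[] args) {
        // A 640x480 NV21 frame -> 460800 bytes
        System.out.println(nv21Bytes(640, 480));
    }
}
```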

Push Method

Compared to the MediaIO method, the push method uses less code but performs no optimization of the captured video frames; you need to process the frames yourself.

// java
// Notify the SDK that the external video source is used.
rtcEngine.setExternalVideoSource(
    true,      // Whether to use an external video source.
    false,     // Whether to use texture as the output format.
    true       // Whether to use the push mode. True means push mode; false means pull mode, which is not supported.
);

// Use the push method to send the video frame data once it is received.
rtcEngine.pushExternalVideoFrame(new AgoraVideoFrame(
    // Pass parameters of the frame (such as format and width and height) in the AgoraVideoFrame construct.
));
API Method
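Frames pushed with pushExternalVideoFrame should carry monotonically increasing timestamps, typically spaced by the frame interval. A hypothetical helper (not part of the Agora SDK) that generates evenly spaced millisecond timestamps for a fixed frame rate:

```java
public class FrameTimestamps {
    // Millisecond timestamps for `count` frames pushed at `fps`, starting at startMs.
    static long[] timestamps(long startMs, int fps, int count) {
        long[] ts = new long[count];
        for (int i = 0; i < count; i++) {
            ts[i] = startMs + (long) i * 1000 / fps;
        }
        return ts;
    }

    public static void main(String[] args) {
        // 30 fps -> frames roughly 33 ms apart
        for (long t : timestamps(0, 30, 4)) {
            System.out.println(t);
        }
    }
}
```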

If an app has its own video module and only needs the Agora SDK for real-time communication, you can customize the video source and renderer by switching the video frame input on or off based on the media engine callbacks. See Customize the Video Source with the Agora Component.

Customize the Video Renderer

Use the IVideoSink Interface of MediaIO to customize the video renderer.

// java
IVideoSink sink = new IVideoSink() {
    @Override
    public boolean onInitialize() {
        return true;
    }

    @Override
    public boolean onStart() {
        return true;
    }

    @Override
    public void onStop() {

    }

    @Override
    public void onDispose() {

    }

    @Override
    public long getEGLContextHandle() {
        // Create your EGL context here if the renderer needs one.
        // A return value of 0 means no EGL context is created in the renderer.
        return 0;
    }

    @Override
    public int getBufferType() {
        return BufferType.BYTE_ARRAY;
    }

    @Override
    public int getPixelFormat() {
        return PixelFormat.NV21;
    }
};

rtcEngine.setLocalVideoRenderer(sink);

API Methods
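With getBufferType() returning BYTE_ARRAY and getPixelFormat() returning NV21, your renderer receives NV21 byte buffers. As an illustration (the helper class is hypothetical, not an SDK API), the first width × height bytes of such a buffer are the luma plane, which is enough for a simple grayscale preview:

```java
import java.util.Arrays;

public class Nv21Luma {
    // Extract the full-resolution Y (luma) plane from an NV21 buffer.
    static byte[] lumaPlane(byte[] nv21, int width, int height) {
        return Arrays.copyOfRange(nv21, 0, width * height);
    }

    public static void main(String[] args) {
        int w = 4, h = 2;
        byte[] frame = new byte[w * h * 3 / 2];             // 12 bytes: 8 luma + 4 chroma
        Arrays.fill(frame, 0, w * h, (byte) 100);           // luma plane
        Arrays.fill(frame, w * h, frame.length, (byte) 50); // interleaved VU plane
        System.out.println(lumaPlane(frame, w, h).length);  // 8
    }
}
```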

Agora provides a helper class and sample code for developers to customize the video renderer. See Customize the Video Renderer with Agora Components.

Agora provides a sample app for customizing the video source and video sink. See Agora Custom Media Device.

Considerations

Customizing the audio/video source and renderer is an advanced feature of the Agora SDK. Ensure that you are experienced in audio and video application development before using it.