Raw Video Data

Introduction

During real-time communications, you can pre- and post-process video data and modify it to achieve the desired playback effects.

The Native SDK provides the raw data functions through the IVideoFrameObserver class. You can pre-process the captured video frames before they are sent to the encoder, and post-process the received video frames after they are decoded.

This article describes how to use raw video data with the IVideoFrameObserver class.
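
To receive these callbacks, you implement a subclass of IVideoFrameObserver and override the frame callbacks you need. The following is a minimal sketch: the class name CMyVideoFrameObserver is hypothetical, the header name IAgoraMediaEngine.h is assumed from the v3.x Windows C++ SDK, and the callback signatures should be checked against your SDK headers (they match the sample code later in this article).

// Minimal observer sketch (hypothetical class name). Override the callbacks you need;
// each one receives the frame by reference so it can be modified in place.
#include <IAgoraMediaEngine.h>

class CMyVideoFrameObserver : public agora::media::IVideoFrameObserver
{
public:
    // Local frame captured by the camera, before it reaches the encoder.
    virtual bool onCaptureVideoFrame(VideoFrame& videoFrame)
    {
        // Pre-process videoFrame here, then return true to pass it on.
        return true;
    }
    // Local frame right before it is sent to the encoder.
    virtual bool onPreEncodeVideoFrame(VideoFrame& videoFrame)
    {
        return true;
    }
    // Decoded frame received from the remote user identified by uid, before rendering.
    virtual bool onRenderVideoFrame(unsigned int uid, VideoFrame& videoFrame)
    {
        // Post-process videoFrame here, then return true to pass it on.
        return true;
    }
};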

Sample project

Refer to the sample project on GitHub to learn how to use raw video data in your project.

Implementation

Before using the raw data functions, ensure that you have implemented the basic real-time communication functions in your project. For details, see Start a Call or Start Live Interactive Streaming.

Follow these steps to implement the raw data functions in your project:

  1. Before joining the channel, call registerVideoFrameObserver to register a video frame observer.
  2. After successful registration, the SDK sends the raw video data via onCaptureVideoFrame, onPreEncodeVideoFrame, or onRenderVideoFrame.
  3. After acquiring the raw data, process it based on your scenario and return the processed data to the SDK through the callbacks mentioned in step 2.

As of v3.0.1, the SDK no longer guarantees that the callbacks in IVideoFrameObserver are reported on the same thread; it only guarantees their sequence. If you use OpenGL to perform image enhancement on the raw video data, you need to actively switch the OpenGL context in the IVideoFrameObserver callbacks to adapt to this multi-threaded scenario (see the sketch below); otherwise, the image enhancement does not work.
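
For example, on Windows a sketch of such a context switch might look like the following. The class name CGLVideoProcFrameObserver, the members m_hDC and m_hGLRC (an OpenGL context created elsewhere in your app), and ApplyGLEnhancement are all hypothetical placeholders for your own code.

// Sketch: re-bind the app's OpenGL context inside the callback, because the SDK
// may invoke the callback from a different thread each time.
bool CGLVideoProcFrameObserver::onCaptureVideoFrame(VideoFrame& videoFrame)
{
    // Make the shared OpenGL context current on the calling thread.
    if (!wglMakeCurrent(m_hDC, m_hGLRC))
        return true; // Context switch failed; pass the frame through unmodified.

    // Run the OpenGL-based image enhancement on the frame (placeholder).
    ApplyGLEnhancement(videoFrame);

    // Release the context so a later callback on another thread can bind it.
    wglMakeCurrent(NULL, NULL);
    return true;
}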

API call sequence

The following diagram shows how to implement the raw data functions in your project:

(Diagram: API call sequence of the raw video data functions)

Data transfer

The following diagram shows the data transfer with the IVideoFrameObserver class:

(Diagram: data transfer with the IVideoFrameObserver class)

With onCaptureVideoFrame, onPreEncodeVideoFrame, and onRenderVideoFrame, you can:

  • Get raw video frames from VideoFrame.
  • Process the raw frames and return them to the SDK or send them to your custom renderer, as shown in the sketch below.
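
For example, the following sketch copies the luma (Y) plane of a frame into a contiguous buffer that you can pass to your own processing code or custom renderer. The class name CCopyVideoProcFrameObserver is hypothetical, and the sketch assumes the v3.x VideoFrame fields width, height, yStride, and yBuffer; it copies row by row because yStride can be larger than width when the SDK pads each row.

#include <cstring>
#include <vector>

// Sketch: copy the Y plane out of the frame, row by row, honoring the row stride.
bool CCopyVideoProcFrameObserver::onCaptureVideoFrame(VideoFrame& videoFrame)
{
    std::vector<unsigned char> luma(static_cast<size_t>(videoFrame.width) * videoFrame.height);
    const unsigned char* src = static_cast<const unsigned char*>(videoFrame.yBuffer);
    for (int row = 0; row < videoFrame.height; ++row)
    {
        memcpy(&luma[static_cast<size_t>(row) * videoFrame.width],
               src + static_cast<size_t>(row) * videoFrame.yStride,
               videoFrame.width);
    }
    // luma now holds a tightly packed copy of the Y plane; hand it to your own
    // pipeline here. Returning true passes the (unmodified) frame back to the SDK.
    return true;
}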

Sample code

  1. Call registerVideoFrameObserver to register a video frame observer.
// Register or unregister the video frame observer
BOOL CAgoraOriginalVideoDlg::RegisterVideoFrameObserver(BOOL bEnable, IVideoFrameObserver* videoFrameObserver)
{
    // Create an AutoPtr instance using IMediaEngine as the template type
    agora::util::AutoPtr<agora::media::IMediaEngine> mediaEngine;
    // The AutoPtr instance calls queryInterface and gets a pointer to the IMediaEngine instance via the IID.
    // The AutoPtr instance then accesses the IMediaEngine pointer and calls registerVideoFrameObserver via IMediaEngine.
    mediaEngine.queryInterface(m_rtcEngine, agora::AGORA_IID_MEDIA_ENGINE);
    int nRet = 0;
    if (mediaEngine.get() == NULL)
        return FALSE;
    if (bEnable) {
        // Register the video frame observer
        nRet = mediaEngine->registerVideoFrameObserver(videoFrameObserver);
    }
    else {
        // Unregister the video frame observer
        nRet = mediaEngine->registerVideoFrameObserver(NULL);
    }
    return nRet == 0 ? TRUE : FALSE;
}
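For example, you might register a custom observer before joining the channel and unregister it when leaving. In this sketch, m_grayVideoFrameObserver is a hypothetical member of type CGrayVideoProcFrameObserver (shown in the next step):

// Sketch: register the observer before joining the channel ...
RegisterVideoFrameObserver(TRUE, &m_grayVideoFrameObserver);
// ... and unregister it when leaving the channel.
RegisterVideoFrameObserver(FALSE, NULL);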
  2. Once you obtain the raw video data, you can pre-process or post-process it.

    Get the raw video data captured by the camera:

// Get the raw video data captured by the camera via onCaptureVideoFrame, process the data with a grayscale filter, and return the data to the SDK
bool CGrayVideoProcFrameObserver::onCaptureVideoFrame(VideoFrame& videoFrame)
{
    int nSize = videoFrame.height * videoFrame.width;
    // Set the U and V chroma planes (each a quarter of the Y plane size in I420) to the
    // neutral value 128, which removes the color information and leaves a grayscale image.
    memset(videoFrame.uBuffer, 128, nSize / 4);
    memset(videoFrame.vBuffer, 128, nSize / 4);
    return true;
}



// Get the raw video data captured by the camera via onCaptureVideoFrame, process it with an average filter, and return the data to the SDK
bool CAverageFilterVideoProcFrameObserver::onCaptureVideoFrame(VideoFrame& videoFrame)
{
    // The filter size oscillates between the limits below, so the blur strength
    // visibly increases and decreases over successive frames.
    static int step = 1;
    static bool flag = true;
    if (flag)
    {
        step += 2;
    }
    else {
        step -= 2;
    }
    if (step >= 151)
    {
        flag = false;
        step -= 4;
    }
    else if (step <= 0) {
        flag = true;
        step += 4;
    }
    // Apply the average filter to the Y plane and to the half-resolution U and V planes.
    AverageFiltering((unsigned char*)videoFrame.yBuffer, videoFrame.width, videoFrame.height, step);
    AverageFiltering((unsigned char*)videoFrame.uBuffer, videoFrame.width / 2, videoFrame.height / 2, step);
    AverageFiltering((unsigned char*)videoFrame.vBuffer, videoFrame.width / 2, videoFrame.height / 2, step);
    return true;
}

Get the raw video data from the remote user:

// Get the raw video data from the remote user via onRenderVideoFrame
bool CGrayVideoProcFrameObserver::onRenderVideoFrame(unsigned int uid, VideoFrame& videoFrame)
{
    // The remote frame can be processed here in the same way as the captured frame;
    // this sample returns it to the SDK unmodified.
    return true;
}
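
The sample above returns the remote frame unmodified. For illustration, a post-processing sketch that applies the same grayscale treatment used on the capture side might look like the following; CGrayRemoteVideoProcFrameObserver is a hypothetical class, not part of the sample project.

// Sketch: post-process a remote frame by neutralizing its chroma planes,
// which renders the received video in grayscale.
bool CGrayRemoteVideoProcFrameObserver::onRenderVideoFrame(unsigned int uid, VideoFrame& videoFrame)
{
    int nSize = videoFrame.height * videoFrame.width;
    memset(videoFrame.uBuffer, 128, nSize / 4);
    memset(videoFrame.vBuffer, 128, nSize / 4);
    return true;
}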

API reference

  • registerVideoFrameObserver
  • onCaptureVideoFrame
  • onPreEncodeVideoFrame
  • onRenderVideoFrame