Agora Component for Customization (Android)
Customize the Video Source with the Agora Component
1. AgoraBufferedCamera2 Class
The AgoraBufferedCamera2.java class shows how to customize a video source that delivers frames as ByteBuffer or ByteArray data.
The constructor of AgoraBufferedCamera2 takes a Context and an optional CaptureParameter that defines the camera parameters and the buffer type. By default, the resolution is 640 x 480, the video format is YUV420P, and the data type is ByteBuffer.
When using a ByteArray or ByteBuffer video source, the SDK accepts video frames in the YUV420P, NV21, or RGBA pixel format.
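For reference, the sketch below wires such a buffered source into the engine. The default constructor and RtcEngine.setVideoSource follow the description above; the CaptureParameter variant is only indicated in a comment because its fields are not listed here, and the context and rtcEngine variables are assumed to exist in the app.

```java
// A minimal sketch: create the buffered camera source with its defaults
// (640 x 480, YUV420P, ByteBuffer) and hand it to the engine.
AgoraBufferedCamera2 camera = new AgoraBufferedCamera2(context);
// To change the resolution, pixel format, or buffer type, pass a CaptureParameter
// to the constructor instead (see AgoraBufferedCamera2.java for its fields).
rtcEngine.setVideoSource(camera);
```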
2. AgoraTextureCamera Class
The AgoraTextureCamera.java class shows how to customize a textured video source. AgoraTextureCamera can be used directly as the video source.
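Because AgoraTextureCamera already implements the video source interface, using it is a matter of swapping the engine's video source. A minimal sketch, assuming the constructor takes a Context plus the capture width and height:

```java
// Use the textured camera source directly as the engine's video source.
IVideoSource source = new AgoraTextureCamera(context, 640, 480);   // constructor arguments are an assumption
rtcEngine.setVideoSource(source);
```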
3. Helper Class and Component
Compared with YUV or RGB data, a textured video source is more complex to transmit because of the GL environment and threading requirements.
SurfaceTextureHelper Class
SurfaceTextureHelper is a helper class provided by the Agora SDK. It lets you use SurfaceTexture without having to build the GL environment, create the texture, or handle the interaction between threads yourself.
Major functions of SurfaceTextureHelper:
- Creates a texture object and builds a SurfaceTexture with that texture.
- Notifies the developer of texture updates when SurfaceTexture has captured a video frame.
Create a SurfaceTextureHelper
Call the create method to create a SurfaceTextureHelper. This method builds the GL thread together with the texture and the SurfaceTexture.
Get the SurfaceTexture
This method returns the created SurfaceTexture. If the GL context or the GL thread is required, call the getEglContext and getHandler methods.
Monitor the SurfaceTexture
Set a listener to be notified of new video frames from SurfaceTexture, and start or stop monitoring by calling the startListening and stopListening methods.
Release SurfaceTexture
Call this method to release relevant resources when SurfaceTexture is no longer needed.
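Putting the four operations together, a typical lifecycle looks like the sketch below. The create arguments (a thread name and an optional shared EGL context), the listener type, and the dispose call are assumptions based on the descriptions above and on Agora sample code.

```java
// Create the helper: this builds the GL thread, the texture, and the SurfaceTexture.
SurfaceTextureHelper helper = SurfaceTextureHelper.create("AgoraGlThread", null);

// Get the SurfaceTexture to be fed with video frames (camera, virtual display, and so on).
SurfaceTexture surfaceTexture = helper.getSurfaceTexture();

// Start monitoring: the callback fires each time SurfaceTexture captures a new frame.
helper.startListening(new SurfaceTextureHelper.OnTextureFrameAvailableListener() {
    @Override
    public void onTextureFrameAvailable(int oesTextureId, float[] transformMatrix, long timestampNs) {
        // The new frame is available as an OES texture; forward it to the SDK here.
    }
});

// When the SurfaceTexture is no longer needed:
helper.stopListening();
helper.dispose();   // releases the GL thread, the texture, and the SurfaceTexture
```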
TextureSource Interface
The TextureSource interface combines SurfaceTextureHelper and IVideoFrameConsumer methods to implement a customized textured video source. SurfaceTextureHelper creates the SurfaceTexture, which is used to capture a video frame and convert it into a texture to be sent to RtcEngine. With the TextureSource interface, developers only need to care about the following:

- Video source functionality and compatibility.
- Capturing the SurfaceTexture video frame.
- Sending the updated texture to RtcEngine.
To customize a textured video source with TextureSource:

- Implement the four TextureSource callbacks for the video source functionality and compatibility.
- Use SurfaceTexture to capture the video frame. See SurfaceTexture for methods of capturing a video frame.
- Once the video frame is captured and updated as a texture, call onTextureFrameAvailable to send the video frame to RtcEngine.
- Release the resources when the video frame is no longer needed.
4. Example of Using External Screen Recording as the Video Source with TextureSource
The following sequence shows an example of how to use external screen recording as the video source.
Step 1. Implement the following callbacks:
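A minimal sketch of a screen-recording source built on TextureSource. The callback names (onCapturerOpened, onCapturerStarted, onCapturerStopped, onCapturerClosed), the constructor shape, and the EglBase.Context type are assumptions drawn from Agora sample code; only their roles follow the steps described here.

```java
import android.hardware.display.VirtualDisplay;
import android.media.projection.MediaProjection;

// A sketch of a screen-recording video source built on TextureSource.
public class ScreenRecordSource extends TextureSource {

    private MediaProjection mediaProjection;   // obtained via MediaProjectionManager beforehand
    private VirtualDisplay virtualDisplay;     // created in Step 2

    public ScreenRecordSource(EglBase.Context sharedContext, int width, int height,
                              MediaProjection mediaProjection) {
        super(sharedContext, width, height);   // constructor shape is an assumption
        this.mediaProjection = mediaProjection;
    }

    @Override
    protected boolean onCapturerOpened() {
        // Create the virtual display that draws the screen into this source's SurfaceTexture (Step 2).
        return true;
    }

    @Override
    protected boolean onCapturerStarted() {
        // Frames flow automatically once the virtual display is attached to the SurfaceTexture.
        return true;
    }

    @Override
    protected void onCapturerStopped() {
        // Stop forwarding frames; keep the capture objects if a restart is expected.
    }

    @Override
    protected void onCapturerClosed() {
        // Release the virtual display and the MediaProjection (Step 4).
    }
}
```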
Step 2. Use SurfaceTexture to create a virtual display for capturing the screen data.
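Wrapping the source's SurfaceTexture in a Surface and mirroring the screen into it uses the standard Android MediaProjection APIs. The getSurfaceTexture accessor on the source is an assumption; everything else is the regular VirtualDisplay setup.

```java
import android.graphics.SurfaceTexture;
import android.hardware.display.DisplayManager;
import android.view.Surface;

// Inside ScreenRecordSource, continuing the sketch from Step 1.
private void createVirtualDisplay(int width, int height, int screenDensityDpi) {
    // Use the SurfaceTexture managed by the texture source as the capture target.
    SurfaceTexture surfaceTexture = getSurfaceTexture();   // assumed accessor on TextureSource
    surfaceTexture.setDefaultBufferSize(width, height);
    Surface inputSurface = new Surface(surfaceTexture);

    virtualDisplay = mediaProjection.createVirtualDisplay(
            "agora-screen-capture",            // name shown by the system
            width, height, screenDensityDpi,
            DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
            inputSurface,
            null /* callback */, null /* handler */);
}
```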
Step 3. Reimplement the callbacks for getting the video data. Make sure to call the parent class method super.onTextureFrameAvailable(oesTextureId, transformMatrix, timeStampNs).
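A sketch of the reimplemented callback. The consumer field, the size fields, and the pixel-format constant are assumptions about how TextureSource exposes IVideoFrameConsumer; the required super call is taken from the note above.

```java
@Override
public void onTextureFrameAvailable(int oesTextureId, float[] transformMatrix, long timestampNs) {
    // Let the parent class update its own texture bookkeeping first (required).
    super.onTextureFrameAvailable(oesTextureId, transformMatrix, timestampNs);

    // Then hand the updated OES texture to the SDK. How the consumer is reached
    // (field name, null handling) depends on the TextureSource implementation.
    if (mConsumer != null) {
        mConsumer.consumeTextureFrame(oesTextureId,
                MediaIO.PixelFormat.TEXTURE_OES.intValue(),
                mWidth, mHeight, 0 /* rotation */,
                System.currentTimeMillis(), transformMatrix);
    }
}
```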
Step 4. Release the resources when the external video source is no longer needed.
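Releasing the capture objects is plain Android API; for example, it can run from onCapturerClosed in the sketch above.

```java
// Inside ScreenRecordSource: free the screen-capture resources.
private void releaseScreenCapture() {
    if (virtualDisplay != null) {
        virtualDisplay.release();
        virtualDisplay = null;
    }
    if (mediaProjection != null) {
        mediaProjection.stop();
        mediaProjection = null;
    }
}
```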
Customize the Video Sink with the Agora Component
The Agora SDK uses its default renderer to render the local and remote video. The IVideoSink interface can be used for more advanced functions, such as:
- To render the local or remote video frame somewhere other than directly on the view component.
- To use a general SurfaceView object or a customized view component.
- To render images in specific areas, such as in gaming.
AgoraSurfaceView Class
AgoraSurfaceView inherits SurfaceView and implements the IVideoSink interface to render video frames in YUV420P, RGB, and Texture (2D/OES).
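A minimal sketch of registering AgoraSurfaceView as the renderer for a remote user. The setBufferType and setPixelFormat calls mirror the BaseVideoRenderer methods described below, and setRemoteVideoRenderer is the RtcEngine method that accepts an IVideoSink; the surrounding variables (context, viewContainer, remoteUid, rtcEngine) are assumed to exist in the app.

```java
// Render a remote user's video with AgoraSurfaceView instead of the default renderer.
AgoraSurfaceView surfaceView = new AgoraSurfaceView(context);
surfaceView.setZOrderMediaOverlay(true);
surfaceView.setBufferType(MediaIO.BufferType.BYTE_BUFFER);   // match the incoming frame type
surfaceView.setPixelFormat(MediaIO.PixelFormat.I420);        // match the incoming pixel format
viewContainer.addView(surfaceView);

rtcEngine.setRemoteVideoRenderer(remoteUid, surfaceView);
```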
AgoraTextureView Class
AgoraTextureView inherits TextureView and implements the IVideoSink interface to render video frames in YUV420P, RGB, and Texture (2D/OES).
The following code shows how to use AgoraTextureView with an external video source, using TextureSource to create the GL environment:
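A minimal sketch, assuming AgoraTextureView exposes an init method that accepts the EGL context obtained from the source via getEglContext, and that its buffer and pixel-format setters mirror BaseVideoRenderer. MyTextureSource and the view ID are hypothetical.

```java
// Local preview: an external textured source rendered by AgoraTextureView.
TextureSource source = new MyTextureSource(null, 1280, 720);               // hypothetical subclass
AgoraTextureView textureView = (AgoraTextureView) findViewById(R.id.local_video_view);

// Share the GL environment created by the texture source with the renderer.
textureView.init(source.getEglContext());
textureView.setBufferType(MediaIO.BufferType.TEXTURE);
textureView.setPixelFormat(MediaIO.PixelFormat.TEXTURE_OES);

rtcEngine.setVideoSource(source);
rtcEngine.setLocalVideoRenderer(textureView);
```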
Helper Class and Component
BaseVideoRenderer Class
Major functions of the BaseVideoRenderer class:
- Supports rendering various formats: I420, RGBA, and TEXTURE_2D/OES.
- Supports various rendering targets: SurfaceView, TextureView, Surface, and SurfaceTexture.
Follow these steps to use the BaseVideoRenderer class:
- Create a customized renderer class that implements the IVideoSink interface and embeds a BaseVideoRenderer object.
- Specify the type and format of the video frame, or call the setBufferType and setPixelFormat methods of the embedded BaseVideoRenderer object.
- Share the EGLContextHandle with RtcEngine, since BaseVideoRenderer uses OpenGL as the renderer and has created the EGLContext.
- Set the object to render to by calling the setRenderView or setRenderSurface method of the embedded BaseVideoRenderer object.
- Control the renderer by implementing the onInitialize, onStart, onStop, and onDispose methods.
- Implement IVideoFrameConsumer, and call the embedded BaseVideoRenderer object with the corresponding format to render the received video frame on the render target.
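The skeleton below follows those steps. The IVideoSink and IVideoFrameConsumer method shapes are based on the Agora mediaio interfaces; BaseVideoRenderer calls other than setBufferType, setPixelFormat, and setRenderView (the constructor argument, getEglContextHandle, and consume) are assumptions to verify against BaseVideoRenderer.java.

```java
import java.nio.ByteBuffer;
import android.view.SurfaceView;

// A customized renderer that implements IVideoSink and delegates drawing
// to an embedded BaseVideoRenderer (a sketch, not the full sample class).
public class MyVideoRenderer implements IVideoSink {
    private final BaseVideoRenderer render = new BaseVideoRenderer("MyVideoRenderer");

    public MyVideoRenderer(SurfaceView targetView) {
        // Declare what this sink accepts and where it draws.
        render.setBufferType(MediaIO.BufferType.BYTE_BUFFER);
        render.setPixelFormat(MediaIO.PixelFormat.I420);
        render.setRenderView(targetView);
    }

    // Share the EGL context with RtcEngine.
    @Override
    public long getEGLContextHandle() {
        return render.getEglContextHandle();               // assumed accessor
    }

    @Override
    public int getBufferType() { return MediaIO.BufferType.BYTE_BUFFER.intValue(); }

    @Override
    public int getPixelFormat() { return MediaIO.PixelFormat.I420.intValue(); }

    // Renderer lifecycle control.
    @Override
    public boolean onInitialize() { return true; }

    @Override
    public boolean onStart() { return true; }

    @Override
    public void onStop() { }

    @Override
    public void onDispose() { }

    // Receive frames in the declared format and pass them to the embedded renderer.
    @Override
    public void consumeByteBufferFrame(ByteBuffer buffer, int format, int width, int height,
                                       int rotation, long timestamp) {
        render.consume(buffer, format, width, height, rotation, timestamp);   // assumed method
    }

    @Override
    public void consumeByteArrayFrame(byte[] data, int format, int width, int height,
                                      int rotation, long timestamp) { /* not used for this sink */ }

    @Override
    public void consumeTextureFrame(int textureId, int format, int width, int height,
                                    int rotation, long timestamp, float[] matrix) { /* not used */ }
}
```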