The AgoraBufferedCamera2.java class shows how to customize a video source that delivers data as ByteBuffer or ByteArray.
The constructor of AgoraBufferedCamera2 takes a Context and an optional CaptureParameter that defines the camera parameters and the buffer type. By default, the resolution is 640 x 480, the video format is YUV420P, and the data type is ByteBuffer.
AgoraBufferedCamera2 source = new AgoraBufferedCamera2(this);
source.useFrontCamera(true);
rtcEngine.setVideoSource(source);
When using the ByteArray or ByteBuffer video source, the SDK accepts pixel data in the YUV420P, NV21, or RGBA format.
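For illustration, a custom buffered source delivers each frame through the IVideoFrameConsumer that the SDK passes to the source's onInitialize callback. The sketch below is hedged: mConsumer, width, and height are assumed to be set up elsewhere in the source, and the buffer is assumed to hold one complete YUV420P frame.
// Hedged sketch: push one I420 frame to the SDK.
// mConsumer is the IVideoFrameConsumer received in onInitialize (assumption).
ByteBuffer frame = ByteBuffer.allocateDirect(width * height * 3 / 2);
// ... fill 'frame' with one YUV420P frame ...
mConsumer.consumeByteBufferFrame(frame,
        MediaIO.PixelFormat.I420.intValue(), // pixel format of the buffer
        width, height,
        0,                                   // rotation in degrees
        System.currentTimeMillis());         // capture timestamp in ms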
The AgoraTextureCamera.java class shows how to customize a textured video source. AgoraTextureCamera can be used directly as the video source.
IVideoSource source = new AgoraTextureCamera(this, 640, 480);
rtcEngine.setVideoSource(source);
Compared with YUV or RGB data, a textured video source is more complex to transmit because of the GL environment and threading requirements.
The SurfaceTextureHelper class is a helper class provided by the Agora SDK. It lets you use SurfaceTexture without having to build a GL environment, create textures, or manage the interaction between threads.
Major functions of SurfaceTextureHelper:
- Creates a SurfaceTexture together with the textured object.
- Monitors when the SurfaceTexture has captured a video frame.
Create a SurfaceTextureHelper
public static SurfaceTextureHelper create(final String threadName, final EglBase.Context sharedContext);
Call the create method to create a SurfaceTextureHelper. This method builds a GL thread together with the texture and the SurfaceTexture.
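A minimal sketch of the call, assuming that passing null as sharedContext makes the helper create its own root EGL context (pass an existing EglBase.Context to share one instead); the thread name is arbitrary:
SurfaceTextureHelper helper = SurfaceTextureHelper.create("AgoraCaptureThread", null);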
Get the SurfaceTexture
public EglBase.Context getEglContext();
public Handler getHandler();
public SurfaceTexture getSurfaceTexture();
The getSurfaceTexture method gets the created texture. If the GL context or the GL thread handler is required, call the getEglContext and getHandler methods.
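For example, the returned SurfaceTexture can be handed to a capture pipeline as a Surface; the buffer size below is an illustrative assumption and should match the actual capture resolution:
SurfaceTexture surfaceTexture = helper.getSurfaceTexture();
surfaceTexture.setDefaultBufferSize(640, 480); // match the capture resolution
Surface captureSurface = new Surface(surfaceTexture); // hand this to the frame producer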
Monitor the SurfaceTexture
public interface OnTextureFrameAvailableListener {
    abstract void onTextureFrameAvailable(int oesTextureId, float[] transformMatrix, long timestampNs);
}
public void startListening(final OnTextureFrameAvailableListener listener);
public void stopListening();
This listener monitors new video frames from the SurfaceTexture. Start or stop monitoring by calling the startListening and stopListening methods.
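A sketch of registering a listener; since OnTextureFrameAvailableListener has a single abstract method, it can be written as a lambda:
helper.startListening((oesTextureId, transformMatrix, timestampNs) -> {
    // A new frame is available as an OES texture; forward it to the SDK here.
});
// ... later, when frames are no longer needed:
helper.stopListening();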
Release SurfaceTexture
void dispose();
Call this method to release the relevant resources when the SurfaceTexture is no longer needed.
The TextureSource class combines the SurfaceTextureHelper and IVideoFrameConsumer methods to implement customized textured video source operations. The SurfaceTexture created by SurfaceTextureHelper captures a video frame and converts it into a texture to be sent to RtcEngine. With the TextureSource class, developers only need to care about the following:
- Capturing the video frame with the SurfaceTexture.
- Sending the captured video frame to RtcEngine.
Implement the four TextureSource callbacks for the video source functionality and compatibility:
abstract protected boolean onCapturerOpened();
abstract protected boolean onCapturerStarted();
abstract protected void onCapturerStopped();
abstract protected void onCapturerClosed();
Use the SurfaceTexture to capture the video frame.
public SurfaceTexture getSurfaceTexture();
See SurfaceTexture for the methods used to capture the video frame.
Once the video frame is captured and updated as a texture, call onTextureFrameAvailable to send the video frame to RtcEngine.
public void onTextureFrameAvailable(int oesTextureId, float[] transformMatrix, long timestampNs);
Release the resources when the video frame is no longer needed.
public void release();
The following steps show an example of using external screen recording as the video source.
Step 1. Implement the following callbacks:
public class ScreenRecordSource extends TextureSource {
    private Context mContext;
    private boolean mIsStart;
    private int mDpi; // screen density for the virtual display
    private VirtualDisplay mVirtualDisplay;
    private MediaProjection mMediaProjection;

    public ScreenRecordSource(Context context, int width, int height, int dpi, MediaProjection mediaProjection) {
        super(null, width, height);
        mContext = context;
        mDpi = dpi;
        mMediaProjection = mediaProjection;
    }

    @Override
    protected boolean onCapturerOpened() {
        createVirtualDisplay();
        return true;
    }

    @Override
    protected boolean onCapturerStarted() {
        return mIsStart = true;
    }

    @Override
    protected void onCapturerStopped() {
        mIsStart = false;
    }

    @Override
    protected void onCapturerClosed() {
        releaseVirtualDisplay();
    }
}
Step 2. Use the SurfaceTexture to create a virtual display for capturing the screen data.
private void createVirtualDisplay() {
    Surface inputSurface = new Surface(getSurfaceTexture());
    if (mVirtualDisplay == null) {
        mVirtualDisplay = mMediaProjection.createVirtualDisplay("MainScreen", mWidth, mHeight, mDpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR, inputSurface, null, null);
    }
}

private void releaseVirtualDisplay() {
    if (mVirtualDisplay != null) {
        mVirtualDisplay.release();
    }
    mVirtualDisplay = null;
}
Step 3. Override the onTextureFrameAvailable callback to get and send the video data.
@Override
public void onTextureFrameAvailable(int oesTextureId, float[] transformMatrix, long timeStampNs) {
    super.onTextureFrameAvailable(oesTextureId, transformMatrix, timeStampNs);
    if (mIsStart && mConsumer != null && mConsumer.get() != null) {
        mConsumer.get().consumeTextureFrame(oesTextureId, MediaIO.PixelFormat.TEXTURE_OES.intValue(), mWidth, mHeight,
                0, System.currentTimeMillis(), transformMatrix);
    }
}
Make sure to call the parent class method super.onTextureFrameAvailable(oesTextureId, transformMatrix, timeStampNs).
Step 4. Release the resources when the external video source is no longer needed.
public void sourceRelease() {
    // Stop the screen projection before releasing the TextureSource resources.
    if (mMediaProjection != null) {
        mMediaProjection.stop();
        mMediaProjection = null;
    }
    release();
}
The Agora SDK uses its default renderer to render the local and remote video. The IVideoSink interface can be used for more advanced functions, such as rendering the video to a SurfaceView object or a customized view component.
AgoraSurfaceView inherits SurfaceView and implements the IVideoSink interface to render video frames in YUV420P, RGB, and Texture (2D/OES).
AgoraSurfaceView render = new AgoraSurfaceView(this);
render.init(MediaIO.BufferType.BYTE_ARRAY, MediaIO.PixelFormat.I420, null);
render.setZOrderOnTop(true);
rtcEngine.setLocalVideoRenderer(render);
AgoraTextureView inherits TextureView and implements the IVideoSink interface to render video frames in YUV420P, RGB, and Texture (2D/OES).
The following code shows the usage of AgoraTextureView with an external video source, sharing the EGL context created by the TextureSource:
AgoraTextureCamera source = new AgoraTextureCamera(this, 640, 480);
AgoraTextureView render = (AgoraTextureView) findViewById(R.id.agora_texture_view);
render.init(source.getEglContext());
render.setBufferType(MediaIO.BufferType.TEXTURE);
render.setPixelFormat(MediaIO.PixelFormat.TEXTURE_OES);
rtcEngine.setVideoSource(source);
rtcEngine.setLocalVideoRenderer(render);
The BaseVideoRenderer class encapsulates the major functions of a renderer, such as setting the buffer type and pixel format of the video frame and setting the view or surface to render to. Follow these steps to use the BaseVideoRenderer class:
1. Create a customized renderer class that implements the IVideoSink interface and embeds a BaseVideoRenderer object.
2. Specify the type and format of the video frame by calling the setBufferType and setPixelFormat methods of the embedded BaseVideoRenderer object.
3. Share the EGL context handle with RtcEngine, since BaseVideoRenderer uses OpenGL as the renderer and has created the EGLContext.
4. Set the target to render to by calling the setRenderView or setRenderSurface method of the embedded BaseVideoRenderer object.
5. Control the renderer by implementing the onInitialize, onStart, onStop, and onDispose methods.
6. Implement the IVideoFrameConsumer methods, and call the embedded BaseVideoRenderer object with the corresponding format to render the received video frame on the target.
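The following skeleton ties these steps together. It is a hedged sketch, not a drop-in implementation: it uses only the methods named above; the package paths, the BaseVideoRenderer constructor argument, and the final render call are assumptions, and exact signatures may differ between SDK versions.
import java.nio.ByteBuffer;
import android.view.SurfaceView;
import io.agora.rtc.mediaio.BaseVideoRenderer; // package path assumed
import io.agora.rtc.mediaio.IVideoSink;
import io.agora.rtc.mediaio.MediaIO;

// Sketch of a customized renderer that embeds a BaseVideoRenderer.
public class MyVideoSink implements IVideoSink {
    // Constructor argument (a log tag) is an assumption.
    private final BaseVideoRenderer mRenderer = new BaseVideoRenderer("MyVideoSink");

    public MyVideoSink(SurfaceView view) {
        // Step 2: declare the frame type and format this sink accepts.
        mRenderer.setBufferType(MediaIO.BufferType.BYTE_BUFFER);
        mRenderer.setPixelFormat(MediaIO.PixelFormat.I420);
        // Step 4: set the target to render to.
        mRenderer.setRenderView(view);
    }

    // Step 3: share the EGL context handle that BaseVideoRenderer created.
    @Override
    public long getEGLContextHandle() {
        return mRenderer.getEGLContextHandle(); // accessor name assumed
    }

    @Override
    public int getBufferType() { return MediaIO.BufferType.BYTE_BUFFER.intValue(); }

    @Override
    public int getPixelFormat() { return MediaIO.PixelFormat.I420.intValue(); }

    // Step 5: control the renderer lifecycle.
    @Override
    public boolean onInitialize() { return true; }

    @Override
    public boolean onStart() { return true; }

    @Override
    public void onStop() { }

    @Override
    public void onDispose() { }

    // Step 6: render each received frame through the embedded renderer.
    @Override
    public void consumeByteBufferFrame(ByteBuffer buffer, int format, int width,
                                       int height, int rotation, long timestamp) {
        // Hypothetical render call; use the BaseVideoRenderer method that
        // matches the declared buffer type and pixel format.
        mRenderer.consume(buffer, format, width, height, rotation, timestamp);
    }

    @Override
    public void consumeByteArrayFrame(byte[] data, int format, int width,
                                      int height, int rotation, long timestamp) { }

    @Override
    public void consumeTextureFrame(int textureId, int format, int width, int height,
                                    int rotation, long timestamp, float[] matrix) { }
}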