Screen sharing enables the host of an interactive live streaming broadcast or video call to display what is on their screen to other users in the channel. Screen sharing is particularly useful for communicating information in the following scenarios:
During video conferencing, the speaker can share a local image, web page, or full presentation with other participants.
For online instruction, the teacher can share slides or notes with students.
Some restrictions and cautions apply to the screen sharing feature, and using it can incur charges. Agora recommends that you read the relevant API reference before calling the API.
The Agora SDK does not provide a method for screen sharing on Android. Therefore, you need to implement this feature using the native screen-capture APIs provided by Android and the custom video-source APIs provided by Agora.
Use android.media.projection and android.hardware.display.VirtualDisplay to get and pass the screen-capture data (requesting the screen-capture permission is shown in the sketch after this list).
Create an OpenGL ES environment. Create a SurfaceView object and pass the object to VirtualDisplay, which works as the recipient of the screen-capture data.
You can get the screen-capture data from the callbacks of SurfaceView. Use either the Push mode or mediaIO mode to push the screen-capture data to the SDK. For details, see Custom Video Source and Renderer.
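Before any of this, the app must ask the user for permission to capture the screen. The following is a minimal sketch of that request using the standard Android MediaProjection APIs; the class name and request code are illustrative and not part of the Agora SDK:

import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;

public class ScreenCaptureRequester {
    // Arbitrary request code used only in this sketch
    private static final int REQUEST_MEDIA_PROJECTION = 1;
    private MediaProjection mMediaProjection;

    // Shows the system dialog that asks the user to allow screen capture
    public void requestCapture(Activity activity) {
        MediaProjectionManager manager = (MediaProjectionManager)
                activity.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        activity.startActivityForResult(manager.createScreenCaptureIntent(), REQUEST_MEDIA_PROJECTION);
    }

    // Call this from the activity's onActivityResult to obtain the MediaProjection instance
    public void onActivityResult(Activity activity, int requestCode, int resultCode, Intent data) {
        if (requestCode == REQUEST_MEDIA_PROJECTION && resultCode == Activity.RESULT_OK) {
            MediaProjectionManager manager = (MediaProjectionManager)
                    activity.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
            mMediaProjection = manager.getMediaProjection(resultCode, data);
        }
    }
}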
The code samples provided in this section use the MediaProjection and VirtualDisplay APIs provided by Android and have the following Android/API version requirements:
The Android version must be Lollipop or higher.
To use MediaProjection APIs, the Android API level must be 21 or higher.
To use VirtualDisplay APIs, the Android API level must be 19 or higher.
// Sets IVideoFrameConsumer to null when the media engine releases the IVideoFrameConsumer
@Override
public void onDispose() {
    Log.e(TAG, "SwitchExternalVideo-onDispose");
    mConsumer = null;
}

@Override
public int getBufferType() {
    return MediaIO.BufferType.TEXTURE.intValue();
}

@Override
public int getCaptureType() {
    return MediaIO.CaptureType.CAMERA.intValue();
}

@Override
public int getContentHint() {
    return MediaIO.ContentHint.NONE.intValue();
}
...
}
// Implements IVideoFrameConsumer
private volatile IVideoFrameConsumer mConsumer;
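For reference, the media engine passes this consumer to the video source in the onInitialize callback of IVideoSource. The following is a minimal sketch of that callback; it is not part of the sample excerpt above:

// Saves the IVideoFrameConsumer provided by the media engine so that frames can be pushed later
@Override
public boolean onInitialize(IVideoFrameConsumer consumer) {
    mConsumer = consumer;
    return true;
}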
Set the custom video source before joining a channel.
// Sets the input thread of the custom video source
// The sample project uses classes from the open-source grafika project, which encapsulates Android's graphics architecture. For details, see https://source.android.com/devices/graphics/architecture
// For the detailed implementation of EglCore, GlUtil, EGLContext, and ProgramTextureOES, see https://github.com/google/grafika
// The GLThreadContext class contains EglCore, EGLContext, and ProgramTextureOES
private boolean prepare() {
    // Creates an OpenGL ES environment based on EglCore
    // Wraps the captured screen data in a ScreenShareInput object and creates an input thread for the external video data
    setExternalVideoInput(input);
    mCurInputType = type;
    return true;
}
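With the input thread prepared, registering the custom video source and joining the channel might look like the following sketch. The names mRtcEngine and ExternalVideoInputManager are assumptions taken from the sample project; ExternalVideoInputManager is the IVideoSource implementation shown above:

// Registers the custom video source before joining the channel
// ExternalVideoInputManager is the IVideoSource implementation shown above (constructor arguments omitted)
ExternalVideoInputManager videoInputManager = new ExternalVideoInputManager();
mRtcEngine.setVideoSource(videoInputManager);
// Joins the channel only after the custom video source is set
mRtcEngine.joinChannel(token, channelName, "", 0);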
During the initialization of the input thread for the external video data, create a VirtualDisplay object with MediaProjection, and render the VirtualDisplay output onto the SurfaceView.
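A minimal sketch of that call follows; mMediaProjection is the MediaProjection obtained from the permission request, and surface, surfaceWidth, surfaceHeight, and screenDensityDpi are placeholders taken from your screen-capture configuration:

// Creates a VirtualDisplay that renders the captured screen onto the given Surface
VirtualDisplay virtualDisplay = mMediaProjection.createVirtualDisplay(
        "ScreenShare",                                   // name of the virtual display
        surfaceWidth, surfaceHeight, screenDensityDpi,   // capture size and density
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR, // mirror the device screen
        surface,                                         // recipient of the screen-capture data
        null, null);                                     // no callback or handler needed here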
Use SurfaceView as the custom video source. After the user joins the channel, the custom video module gets the screen-capture data using consumeTextureFrame in ExternalVideoInputThread and passes the data to the SDK.
public void run() {
    ...
    // Calls updateTexImage() to update the data to the OpenGL ES texture object
    // Calls getTransformMatrix() to retrieve the texture transform matrix
    try {
        mSurfaceTexture.updateTexImage();
        mSurfaceTexture.getTransformMatrix(mTransform);
    } catch (Exception e) {
        e.printStackTrace();
    }
    // Gets the screen-capture data from onFrameAvailable, which is overridden in ScreenShareInput and provides information such as the texture ID and the transform matrix
    // There is no need to render the screen-capture data on the local view
    ...
}
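Inside that loop, pushing the captured texture to the SDK with consumeTextureFrame could look like the following sketch; mVideoWidth, mVideoHeight, and the rotation value are assumptions taken from the screen-capture configuration:

// Pushes the captured texture to the SDK through the IVideoFrameConsumer
if (mConsumer != null) {
    mConsumer.consumeTextureFrame(
            mTextureId,                                   // texture updated by updateTexImage()
            MediaIO.PixelFormat.TEXTURE_OES.intValue(),   // pixel format of the texture
            mVideoWidth, mVideoHeight,                    // dimensions of the captured frames
            0,                                            // rotation of the frame
            System.currentTimeMillis(),                   // timestamp of the frame
            mTransform);                                  // transform matrix from getTransformMatrix()
}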
Currently, the Agora Video SDK for Android supports creating only one RtcEngine instance per app, so to send video from screen sharing and the local camera at the same time you need two separate processes.
Create one process for screen sharing and another for the video captured by the local camera. The two processes send video data to the SDK as follows:
Local-camera process: Implemented with joinChannel. This process is often the main process and communicates with the screen sharing process via AIDL (Android Interface Definition Language). See Android documentation to learn more about AIDL.
Screen sharing process: Implemented with MediaProjection, VirtualDisplay, and custom video capture. The screen sharing process creates an RtcEngine object and uses the object to create and join a channel for screen sharing.
The users of both processes must join the same channel. After joining a channel, a user subscribes to the audio and video streams of all other users in the channel by default, thus incurring all associated usage costs. Because the screen sharing process is only used to publish the screen sharing stream, you can call muteAllRemoteAudioStreams(true) and muteAllRemoteVideoStreams(true) to mute remote audio and video streams.
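In the screen sharing process, this typically amounts to two calls on its RtcEngine instance (named mScreenShareEngine here for illustration) before or right after joining the channel:

// The screen sharing process only publishes, so stop subscribing to remote streams
mScreenShareEngine.muteAllRemoteAudioStreams(true);
mScreenShareEngine.muteAllRemoteVideoStreams(true);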
The following sample code uses two processes to send both the screen sharing video and the locally captured video. AIDL is used to communicate between the processes.
Configure android:process in AndroidManifest.xml for the components.
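For instance, if the screen sharing logic runs in a bound service, the manifest entry could look like the following; the service class name is a placeholder for whichever component hosts the screen sharing code in your project:

<!-- Runs the screen sharing service in a separate process -->
<service
    android:name="io.agora.rtc.ss.impl.ScreenSharingService"
    android:process=":screensharingsvc" />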
Create an AIDL interface, which includes methods to communicate between processes.
// Includes methods to manage the screen sharing process
// IScreenSharing.aidl
package io.agora.rtc.ss.aidl;

import io.agora.rtc.ss.aidl.INotification;

interface IScreenSharing {
    void registerCallback(INotification callback);
    void unregisterCallback(INotification callback);
    void startShare();
    void stopShare();
    void renewToken(String token);
}
// Includes callbacks to receive notifications from the screen sharing process
// INotification.aidl
package io.agora.rtc.ss.aidl;

interface INotification {
    void onError(int error);
    void onTokenWillExpire();
}
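On the other side of the connection, the local-camera (main) process binds to the screen sharing service and drives it through this AIDL interface. The following is a minimal sketch under the assumption that the screen sharing process is hosted in a bound service; the class and method names outside the AIDL interface are illustrative:

import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.os.IBinder;
import android.os.RemoteException;

import io.agora.rtc.ss.aidl.INotification;
import io.agora.rtc.ss.aidl.IScreenSharing;

public class ScreenSharingBinder {
    private IScreenSharing mProxy;

    // Receives notifications from the screen sharing process
    private final INotification mCallback = new INotification.Stub() {
        @Override
        public void onError(int error) { /* handle the error reported by the screen sharing process */ }

        @Override
        public void onTokenWillExpire() { /* renew the token, for example via mProxy.renewToken(...) */ }
    };

    private final ServiceConnection mConnection = new ServiceConnection() {
        @Override
        public void onServiceConnected(ComponentName name, IBinder service) {
            // Converts the binder from the screen sharing process into the AIDL proxy
            mProxy = IScreenSharing.Stub.asInterface(service);
            try {
                mProxy.registerCallback(mCallback);
                mProxy.startShare();
            } catch (RemoteException e) {
                e.printStackTrace();
            }
        }

        @Override
        public void onServiceDisconnected(ComponentName name) {
            mProxy = null;
        }
    };

    // Binds to the screen sharing service; the service class is a placeholder for your own component
    public void bind(Context context, Class<?> serviceClass) {
        Intent intent = new Intent(context, serviceClass);
        context.bindService(intent, mConnection, Context.BIND_AUTO_CREATE);
    }
}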
Implement the screen sharing process. Screen sharing is implemented with MediaProjection, VirtualDisplay, and custom video source. The screen sharing process creates an RtcEngine object and uses the object to create and join a channel for screen sharing.
// Define the ScreenSharingClient object
public class ScreenSharingClient {
    private static final String TAG = ScreenSharingClient.class.getSimpleName();