I. Obtaining remote raw data
1. Obtaining H.264 data
1) Remote encoded video data monitor
/**
 * @type callback
 * @region video management
 * @brief Remote encoded video data monitor. <br>
 *        Note: this callback is thrown synchronously on an SDK-internal (non-UI) thread. Do not perform
 *        time-consuming work or touch the UI directly inside it, or the app may crash.
 */
class IRemoteEncodedVideoFrameObserver {
public:
    /**
     * @hidden constructor/destructor
     * @brief Destructor
     */
    virtual ~IRemoteEncodedVideoFrameObserver() {}
    /**
     * @type callback
     * @region video data callback
     * @brief Triggered when the SDK detects remote encoded video data after
     *        registerRemoteEncodedVideoFrameObserver{@link #IRTCVideo#registerRemoteEncodedVideoFrameObserver} has been called.
     * @param stream_info Information about the received remote stream. See RemoteStreamKey{@link #RemoteStreamKey}.
     * @param video_stream Information about the received remote video frame. See IEncodedVideoFrame{@link #IEncodedVideoFrame}.
     */
    virtual void onRemoteEncodedVideoFrame(const RemoteStreamKey& stream_info, const IEncodedVideoFrame& video_stream) = 0;
};
2) Deriving from IRemoteEncodedVideoFrameObserver
class ByteRTCEventHandler : public QObject,
                            public bytertc::IRTCVideoEventHandler,
                            public bytertc::IAudioEffectPlayerEventHandler,
                            public bytertc::IMixedStreamObserver,
                            public bytertc::IMediaPlayerEventHandler,
                            public bytertc::IRemoteEncodedVideoFrameObserver,
                            public bytertc::IVideoSink
virtual void onRemoteEncodedVideoFrame(const bytertc::RemoteStreamKey& stream_info, const bytertc::IEncodedVideoFrame& video_stream) override;
void ByteRTCEventHandler::onRemoteEncodedVideoFrame(const bytertc::RemoteStreamKey& stream_info, const bytertc::IEncodedVideoFrame& video_stream) {}
std::unique_ptr<ByteRTCEventHandler> m_handler;
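The empty override above is where the encoded H.264 payload arrives. Below is a minimal sketch of dumping the received bitstream to a file; it assumes IEncodedVideoFrame exposes data() / dataSize() style accessors, so check the exact method names against your bytertc header before using it.

#include <fstream>

// Sketch only: append every received encoded frame to an Annex-B dump file.
// Assumes IEncodedVideoFrame::data() / dataSize() exist; adjust to the actual bytertc header.
void ByteRTCEventHandler::onRemoteEncodedVideoFrame(const bytertc::RemoteStreamKey& stream_info,
                                                    const bytertc::IEncodedVideoFrame& video_stream) {
    static std::ofstream dump("remote_stream.h264", std::ios::binary | std::ios::app);
    if (dump.is_open() && video_stream.data() != nullptr) {
        dump.write(reinterpret_cast<const char*>(video_stream.data()),
                   static_cast<std::streamsize>(video_stream.dataSize()));
    }
}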
3) registerRemoteEncodedVideoFrameObserver
/**
 * @type api
 * @region video management
 * @brief Registers a callback for remote encoded video data. <br>
 *        After registration, the SDK triggers onRemoteEncodedVideoFrame{@link #IRemoteEncodedVideoFrameObserver#onRemoteEncodedVideoFrame}
 *        whenever it detects a remote encoded video frame.
 * @param observer Remote encoded video data monitor. See IRemoteEncodedVideoFrameObserver{@link #IRemoteEncodedVideoFrameObserver}.
 * @return
 *        + 0: Success.<br>
 *        + < 0: Failure. See ReturnStatus{@link #ReturnStatus} for more details.
 * @note
 *        + See [Custom Video Encoding and Decoding](https://docs.byteplus.com/byteplus-rtc/docs/82921#custom-video-decoding) for more details about custom video decoding.<br>
 *        + This method applies to manual subscription mode and can be called before or after entering the room; calling it before entering the room is recommended.<br>
 *        + Unregister before the engine is destroyed by calling this method with the parameter set to nullptr.
 */
virtual int registerRemoteEncodedVideoFrameObserver(IRemoteEncodedVideoFrameObserver* observer) = 0;
m_video->registerRemoteEncodedVideoFrameObserver(m_handler.get());
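Per the note above, the observer should be unregistered before the engine is destroyed by passing nullptr:

// Unregister the observer before destroying the engine, as required by the API note above.
m_video->registerRemoteEncodedVideoFrameObserver(nullptr);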
2. Custom video renderer
0) IVideoSink
/**
 * @type keytype
 * @brief Custom video renderer
 */
class IVideoSink {
public:
    /**
     * @type keytype
     * @brief Video frame encoding format
     */
    enum PixelFormat {
        /**
         * @brief YUV I420 format
         */
        kI420 = VideoPixelFormat::kVideoPixelFormatI420,
        /**
         * @brief RGBA format, byte order R8 G8 B8 A8
         */
        kRGBA = VideoPixelFormat::kVideoPixelFormatRGBA,
        /**
         * @brief Original video frame format
         */
        kOriginal = VideoPixelFormat::kVideoPixelFormatUnknown,
    };
    /**
     * @type callback
     * @brief Video frame callback
     * @param [out] video_frame Video frame structure. See IVideoFrame{@link #IVideoFrame}.
     * @return The return value is currently unused.
     */
    virtual bool onFrame(IVideoFrame* video_frame) = 0;
    /**
     * @type callback
     * @region room management
     * @brief Gets the time taken by custom rendering.
     * @note The elapsed time is collected for reporting. You need to compute the average rendering time yourself.
     */
    virtual int getRenderElapse() = 0;
    /**
     * @type callback
     * @brief Releases the renderer.
     * @note Notifies you that the renderer is about to be discarded; resources can be released upon receiving this notification.
     */
    virtual void release() {}
    /**
     * @hidden constructor/destructor
     * @brief Destructor
     */
    virtual ~IVideoSink() = default;
    /**
     * @hidden sink id
     * @brief sink id
     */
    virtual void* uniqueId() const { return (void *)this; }
};
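A class deriving from IVideoSink (as ByteRTCEventHandler does above) must implement at least the pure virtuals onFrame() and getRenderElapse(). A minimal sketch; MyVideoSink is a hypothetical name used only for illustration:

// Minimal custom sink: onFrame() consumes each frame, getRenderElapse() reports rendering time.
class MyVideoSink : public bytertc::IVideoSink {
public:
    bool onFrame(bytertc::IVideoFrame* video_frame) override {
        // Consume the frame here: copy it out, convert it, or hand it to your own renderer.
        return true;
    }
    // Report 0 ms here; a real sink would measure and average its own rendering time.
    int getRenderElapse() override { return 0; }
    void release() override { /* free any resources tied to this sink */ }
};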
1) setRemoteVideoSink
/**
 * @type api
 * @deprecated since 3.57, use setRemoteVideoRender{@link #IRTCVideo#setRemoteVideoRender} instead.
 * @region Custom Video Capturing & Rendering
 * @brief Binds the remote video stream to a custom renderer.
 * @param stream_key Remote stream information specifying the source and type of the video stream to be rendered. See RemoteStreamKey{@link #RemoteStreamKey}.
 * @param video_sink Custom video renderer. See IVideoSink{@link #IVideoSink}.
 * @param required_format Video frame encoding format accepted by the video_sink. See PixelFormat{@link #PixelFormat}.
 * @return
 *        + 0: Success.<br>
 *        + < 0: Failure. See ReturnStatus{@link #ReturnStatus} for more details.
 * @note
 *        + By default the RTC SDK renders video with its own (internal) renderer.<br>
 *        + This API can be called before or after entering the room. To call it before entering the room, you need to obtain the remote stream information in advance; if you cannot, call it after joining the room and receiving the remote stream information via onUserPublishStream{@link #IRTCRoomEventHandler#onUserPublishStream}.<br>
 *        + To unbind, you must set video_sink to null. The binding is cleared when you leave the room.<br>
 *        + This method delivers post-processed video frames; to obtain frames from other positions in the pipeline (e.g. decoded frames), call setRemoteVideoRender{@link #IRTCVideo#setRemoteVideoRender}.
 */
virtual int setRemoteVideoSink(RemoteStreamKey stream_key, IVideoSink* video_sink, IVideoSink::PixelFormat required_format) = 0;
2) Setting the rendering mode when a remote user publishes a stream
Note: once registerRemoteEncodedVideoFrameObserver has been set, setRemoteVideoSink no longer takes effect.
// A remote user publishes a stream
void QuickStartWidget::onSigUserPublishStream(std::string roomid, std::string uid, bytertc::MediaStreamType type)
{
    QString log_str = QString("onUserPublishStream,roomid:") + QString::fromStdString(roomid)
                      + ",uid:" + QString::fromStdString(uid)
                      + ",type:" + QString::number(type);
    appendCallback(log_str);
    if (!m_remote_rendered) {
        if (0) {
            // Internal renderer: bind the remote stream to a window handle via setRemoteVideoCanvas.
            bytertc::VideoCanvas cas;
            bytertc::RemoteStreamKey key;
            key.room_id = roomid.c_str();
            key.user_id = uid.c_str();
            key.stream_index = bytertc::kStreamIndexMain;
            cas.background_color = 0;
            cas.render_mode = bytertc::kRenderModeHidden;
            cas.view = nullptr;
            m_video->setRemoteVideoCanvas(key, cas);
            cas.view = (void*)ui->widget_remote->getWinId();
            m_video->setRemoteVideoCanvas(key, cas);
            ui->widget_remote->setUserInfo(roomid, uid);
            m_remote_rendered = true;
        } else {
            // Custom renderer: bind the remote stream to our IVideoSink implementation and request RGBA frames.
            bytertc::RemoteStreamKey key;
            key.room_id = roomid.c_str();
            key.user_id = uid.c_str();
            key.stream_index = bytertc::kStreamIndexMain;
            m_video->setRemoteVideoSink(key, m_handler.get(), bytertc::IVideoSink::PixelFormat::kRGBA);
            m_remote_rendered = true;
        }
    }
}
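When the remote user stops publishing (or you leave the room), the sink binding should be released and the flag reset so a later publish can be bound again. A hedged sketch; onSigUserUnpublishStream is a hypothetical counterpart slot to the handler above, not part of the sample as shipped:

// Hypothetical counterpart slot: release the custom sink when the remote stream goes away.
void QuickStartWidget::onSigUserUnpublishStream(std::string roomid, std::string uid, bytertc::MediaStreamType type)
{
    bytertc::RemoteStreamKey key;
    key.room_id = roomid.c_str();
    key.user_id = uid.c_str();
    key.stream_index = bytertc::kStreamIndexMain;
    // Unbind by passing null for the sink, as described in the setRemoteVideoSink note.
    m_video->setRemoteVideoSink(key, nullptr, bytertc::IVideoSink::PixelFormat::kRGBA);
    m_remote_rendered = false;
}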
3) Obtaining the remote raw data
bool ByteRTCEventHandler::onFrame(bytertc::IVideoFrame* video_frame) {
    bytertc::VideoFrameType type = video_frame->frameType();
    bytertc::VideoPixelFormat format = video_frame->pixelFormat();
    bytertc::VideoContentType contentType = video_frame->videoContentType();
    int width = video_frame->width();
    int height = video_frame->height();
    bytertc::VideoRotation rotation = video_frame->rotation();
    bytertc::ColorSpace space = video_frame->colorSpace();
    int numPlanes = video_frame->numberOfPlanes();
    // The sink was registered with kRGBA, so the frame carries a single interleaved plane.
    uint8_t* data = video_frame->getPlaneData(numPlanes - 1);
    SaveRGBAToPNG(data, width, height, "output.png");
    return true;
}
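One caveat about the plane access above: for kRGBA frames numberOfPlanes() is 1, so getPlaneData(numPlanes - 1) is the same as getPlaneData(0), but the rows may be padded beyond width * 4 bytes; for kI420 frames there are 3 planes (Y, U, V) that must each be read separately. Below is a hedged sketch of a hypothetical helper, SavePackedRGBAFrame, that repacks the RGBA plane before saving; it assumes IVideoFrame also exposes a getPlaneStride() accessor, so verify the name against your bytertc header.

#include <cstring>
#include <vector>

// Sketch: repack an RGBA plane that may have row padding into a tight buffer, then save it.
// Assumes IVideoFrame::getPlaneStride() exists; the plane stride may exceed width * 4 bytes.
static void SavePackedRGBAFrame(bytertc::IVideoFrame* video_frame, const std::string& filePath) {
    const int width = video_frame->width();
    const int height = video_frame->height();
    const uint8_t* src = video_frame->getPlaneData(0);   // RGBA frames carry one plane
    const int stride = video_frame->getPlaneStride(0);   // bytes per row
    std::vector<uint8_t> packed(static_cast<size_t>(width) * height * 4);
    for (int y = 0; y < height; ++y) {
        std::memcpy(packed.data() + static_cast<size_t>(y) * width * 4,
                    src + static_cast<size_t>(y) * stride,
                    static_cast<size_t>(width) * 4);
    }
    SaveRGBAToPNG(packed.data(), width, height, filePath);
}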
Test
#define STB_IMAGE_WRITE_IMPLEMENTATION
#include "stb_image_write.h"void SaveRGBAToPNG(uint8_t* rgbaData, int width, int height, const std::string& filePath) {// 第4個參數是每像素通道數,這里RGBA是4// 每行像素的跨度是 width * 4 字節stbi_write_png(filePath.c_str(), width, height, 4, rgbaData, width * 4);
}
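If the frame's row stride differs from width * 4 (see the repacking sketch above), an alternative is to pass the stride straight through, since stbi_write_png takes the row stride in bytes as its last argument. A variant helper, shown only as a sketch:

// Variant that accepts an explicit row stride instead of assuming tightly packed rows.
void SaveRGBAToPNGWithStride(uint8_t* rgbaData, int width, int height, int strideBytes, const std::string& filePath) {
    stbi_write_png(filePath.c_str(), width, height, 4, rgbaData, strideBytes);
}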