Preface

        I discovered by chance that OpenCV can run a GStreamer command pipeline and connect to it through appsink for processing. After repeated testing, I implemented the following:

        1. OpenCV runs a GStreamer pipeline and displays the frames received through appsink.

        2. OpenCV runs a GStreamer pipeline, then passes the Mat image data back to GStreamer for display through appsrc.

        3. Added OpenCV processing, then used GStreamer's overlay to bind a Qt QWidget for display.

1. Environment setup and a simple test demo

Notes:

  1. The GStreamer and OpenCV versions must match. I currently use GStreamer 1.16.3 with OpenCV 4.2.0. I haven't pinned down the exact compatibility matrix, but OpenCV 3.4 definitely does not work with GStreamer 1.16.

  2. The library path is /usr/lib/x86_64-linux-gnu/ on the virtual machine and /usr/local/lib on the Orin; the .pro file must distinguish between the two.

Install the OpenCV environment:

Option 1: build OpenCV from source; see "Jetson Orin NX 开发指南(5): 安装 OpenCV 4.6.0 并配置 CUDA 以支持 GPU 加速" on CSDN.

Option 2: install the development packages with apt: sudo apt-get update && sudo apt-get install libopencv-dev

I used Option 2, but you then have to locate the headers and libraries yourself.

Install the GStreamer environment (see other articles, or run the following):

sudo apt-get install -y libgstreamer1.0-0 \
            gstreamer1.0-plugins-base \
            gstreamer1.0-plugins-good \
            gstreamer1.0-plugins-bad \
            gstreamer1.0-plugins-ugly \
            gstreamer1.0-libav \
            gstreamer1.0-doc \
            gstreamer1.0-tools \
            libgstreamer1.0-dev \
            libgstreamer-plugins-base1.0-dev

sudo apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libqt5gstreamer-dev libgtk-3-dev

sudo apt-get install libpng12-0 

What the code does: OpenCV runs a GStreamer pipeline.

The .pro file:

QT       += core gui 

greaterThan(QT_MAJOR_VERSION, 4): QT += widgets
DESTDIR = $$PWD/bin
CONFIG += c++11
    
DEFINES += QT_DEPRECATED_WARNINGS

SOURCES += \
    main.cpp
 
# Orin:
CONFIG += link_pkgconfig
PKGCONFIG += opencv4
LIBS    += -L/usr/local/lib -lopencv_core -lopencv_highgui -lopencv_imgproc -lopencv_imgcodecs

INCLUDEPATH += \
		/usr/include/opencv2/

# Virtual machine:
#CONFIG      += link_pkgconfig
#PKGCONFIG   += opencv4
#LIBS        += -L/usr/lib/x86_64-linux-gnu/ -lopencv_core -lopencv_highgui -lopencv_imgproc -lopencv_imgcodecs -lopencv_videoio
#INCLUDEPATH += /usr/include/opencv4/opencv2

main.cpp:

Note: if you don't have a camera, replace v4l2src device=/dev/video0 below with videotestsrc.

#include <opencv2/opencv.hpp>

int main() {
    // GStreamer pipeline string
    std::string pipeline = "v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! videoconvert ! video/x-raw,format=BGR ! appsink sync=0 drop=1";

    // Capture the video stream through GStreamer
    cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);

    // Check that the capture opened successfully
    if (!cap.isOpened()) {
        std::cout << "Error opening video stream" << std::endl;
        return -1;
    }

    // Read and display the stream
    cv::Mat frame;
    while (true) {
        cap >> frame;  // read a frame
        if (frame.empty()) break;  // exit on an empty frame
        cv::imshow("Video", frame);  // show the frame
        if (cv::waitKey(1) == 27) break;  // exit on ESC
    }

    // Release the VideoCapture
    cap.release();
    cv::destroyAllWindows();

    return 0;
}

If the video plays, everything is set up correctly.

2. OpenCV runs a GStreamer pipeline, receives frames via appsink, pushes them back through appsrc, and binds the display to a QWidget via overlay

The .pro file (adjust the libraries to your own setup):

QT       += core gui # multimediawidgets

greaterThan(QT_MAJOR_VERSION, 4): QT += widgets
DESTDIR = $$PWD/bin
CONFIG += c++11


DEFINES += QT_DEPRECATED_WARNINGS

SOURCES += \
    main.cpp


# Default rules for deployment.
qnx: target.path = /tmp/$${TARGET}/bin
else: unix:!android: target.path = /opt/$${TARGET}/bin
!isEmpty(target.path): INSTALLS += target


CONFIG      += link_pkgconfig
PKGCONFIG   += opencv4
LIBS        += -L/usr/lib/x86_64-linux-gnu/ -lopencv_core -lopencv_highgui -lopencv_imgproc -lopencv_imgcodecs -lopencv_videoio
INCLUDEPATH += /usr/include/opencv4/opencv2
#*************************************************

CONFIG += link_pkgconfig
# Virtual machine environment: gstreamer************************
PKGCONFIG += gstreamer-1.0 gstreamer-plugins-base-1.0  opencv4  #gtk+-3.0
LIBS    += -lX11
LIBS    +=-lglib-2.0
LIBS    +=-lgobject-2.0
LIBS    +=-lgstreamer-1.0          # <gst/gst.h>
LIBS    +=-lgstvideo-1.0             # <gst/video/videooverlay.h>
LIBS    +=-L/usr/lib/x86_64-linux-gnu/gstreamer-1.0
LIBS    +=-lgstautodetect
LIBS    +=-lgstaudio-1.0
LIBS    +=-lgstapp-1.0
LIBS    += -L/usr/local/lib/ -lgstrtspserver-1.0
LIBS    += -ltesseract

INCLUDEPATH += \
            /usr/include/glib-2.0 \
            /usr/lib/x86_64-linux-gnu/glib-2.0/include \
            /usr/include/gstreamer-1.0 \
            /usr/lib/x86_64-linux-gnu/gstreamer-1.0/include
# ************************************

main.cpp:

Implementation approach:

(diagram: OpenCV pulls camera frames through a GStreamer appsink pipeline, processes them, then pushes them back via appsrc to a GStreamer display pipeline bound to a Qt window)

As the diagram shows, here is part of the source:

1. OpenCV runs the GStreamer capture pipeline:

// Start capturing video and push it to appsrc (v4l2src device=/dev/video0)
    cv::VideoCapture capture("v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! videoconvert ! video/x-raw,format=BGR ! appsink sync=0 drop=1", cv::CAP_GSTREAMER);
    cv::Mat orinFrame;

    gst_app_src_set_stream_type(GST_APP_SRC(appsrc), GST_APP_STREAM_TYPE_STREAM);

2. OpenCV shows the received frame and pushes the data to appsrc for the next GStreamer pipeline:

    while (capture.isOpened()) {

        capture.read(orinFrame);

        if (orinFrame.empty()) {
            break;
        }

        cv::imshow("Video", orinFrame);  // show the original frame

        // Additional processing:
        cv::Mat frame;
        frame = orinFrame;
        //cv::bitwise_not(orinFrame, frame);      // invert colors
        //cv::GaussianBlur(orinFrame, frame, cv::Size(5, 5), 0);  // Gaussian blur


        // Create a GStreamer buffer
        GstBuffer* buffer = gst_buffer_new_allocate(NULL, frame.total() * frame.elemSize(), NULL);
        GstMapInfo info;

        gst_buffer_map(buffer, &info, GST_MAP_WRITE);
        memcpy(info.data, frame.data, frame.total() * frame.elemSize());
        gst_buffer_unmap(buffer, &info);

        // Push the buffer to appsrc
        GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer);
        if (ret != GST_FLOW_OK) {
            g_printerr("Error pushing buffer to appsrc: %s\n", gst_flow_get_name(ret));
        }

        // Wait a bit to approximate the frame interval
        cv::waitKey(30);
    }

3. Create the receiving GStreamer pipeline and hand it to a QWidget for display

    There are two ways to create the pipeline:

        1. Use gst_parse_launch, then look up element pointers with gst_bin_get_by_name.

        2. Assemble the pipeline from individual GstElement objects.

    First approach:

// Using gst_parse_launch
    // Create the GStreamer pipeline
    std::string pipeline_str = "appsrc name=source caps=\"video/x-raw,format=BGR,width=640,height=480\" ! videoconvert ! xvimagesink name=vsink2";

    GstElement* pipeline = gst_parse_launch(pipeline_str.c_str(), NULL);
    // Get the appsrc element
    GstElement* appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "source");
    GstElement* vsink2 = gst_bin_get_by_name(GST_BIN(pipeline), "vsink2");

    Second approach:

    // Assemble from individual elements:
    GstElement* pipeline = gst_pipeline_new("test-pipeline");

    // Video parameters
    int width = 640;
    int height = 480;
    int fps = 30;

    // Create the appsrc element
    GstElement* appsrc = gst_element_factory_make("appsrc", "video-source");
    // Create the sink elements
    GstElement* fpssink = gst_element_factory_make("fpsdisplaysink", "fpssink");
    GstElement* vsink = gst_element_factory_make("xvimagesink", "vsink");    //glimagesink  ximagesink  xvimagesink
    GstElement* overlay = gst_element_factory_make("timeoverlay", "overlay");
    GstElement* converter = gst_element_factory_make("videoconvert", "converter");
    g_object_set(G_OBJECT(appsrc), "caps",
                 gst_caps_new_simple("video/x-raw",
                                     "format", G_TYPE_STRING, "BGR",
                                     "width", G_TYPE_INT, width,
                                     "height", G_TYPE_INT, height,
                                     "framerate", GST_TYPE_FRACTION, fps, 1,
                                     NULL),
                 "is-live", TRUE,
                 "format", GST_FORMAT_TIME,
                 NULL);

    // Add and link appsrc, videoconvert, and the video sink
    gst_bin_add_many(GST_BIN(pipeline),appsrc, converter, vsink, NULL);
    if (!gst_element_link_many(appsrc, converter, vsink, NULL)) {
        g_printerr("Failed to link appsrc, converter and sink!\n");
        gst_object_unref(pipeline);
        return -1;
    }

4. Attach the Qt window and start the pipeline:

    // Attach the Qt window
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (vsink2), xwinid);

    // Start the pipeline
    GstStateChangeReturn state_ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
    if (state_ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr("Failed to start pipeline!\n");
        gst_object_unref(pipeline);
        return -1;
    }

Notes:

  1. The receiving pipeline doesn't strictly need to be bound to a QWidget; binding it just makes it easy to add other widgets later, such as buttons and labels.
  2. In the receiving pipeline, the caps must be set on the appsrc.
  3. In the receiving pipeline, many plugins such as fpsdisplaysink and timeoverlay do not work properly; I haven't found the cause yet.

Full code:


#include <QApplication>
#include <QMainWindow>
#include <opencv2/opencv.hpp>
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <gst/video/videooverlay.h>

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);

    // Initialize GStreamer
    GMainLoop *loop;
    // Optional: set the GStreamer debug environment variable
    //g_setenv("GST_DEBUG", "4", TRUE);
    gst_init(&argc, &argv);
    loop = g_main_loop_new (NULL, FALSE);
    // Create the window
    QWidget *window = new QWidget();
    window->resize(640, 480);
    window->show();
    WId xwinid = window->winId();



#if 1
    // Using gst_parse_launch
    // Create the GStreamer pipeline
    std::string pipeline_str = "appsrc name=source caps=\"video/x-raw,format=BGR,width=640,height=480\" ! videoconvert ! xvimagesink name=vsink2";

    GstElement* pipeline = gst_parse_launch(pipeline_str.c_str(), NULL);
    // Get the appsrc element
    GstElement* appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "source");
    GstElement* vsink2 = gst_bin_get_by_name(GST_BIN(pipeline), "vsink2");
    //GstElement* vsink = gst_element_factory_make("glimagesink", "vsink");
    //g_object_set (vsink2, "video-sink", vsink, NULL);
#else
    // Assemble from individual elements:
    GstElement* pipeline = gst_pipeline_new("test-pipeline");

    // Video parameters
    int width = 640;
    int height = 480;
    int fps = 30;

    // Create the appsrc element
    GstElement* appsrc = gst_element_factory_make("appsrc", "video-source");
    // Create the sink elements
    GstElement* fpssink = gst_element_factory_make("fpsdisplaysink", "fpssink");
    GstElement* vsink = gst_element_factory_make("xvimagesink", "vsink");    //glimagesink  ximagesink  xvimagesink
    GstElement* overlay = gst_element_factory_make("timeoverlay", "overlay");
    GstElement* converter = gst_element_factory_make("videoconvert", "converter");
    g_object_set(G_OBJECT(appsrc), "caps",
                 gst_caps_new_simple("video/x-raw",
                                     "format", G_TYPE_STRING, "BGR",
                                     "width", G_TYPE_INT, width,
                                     "height", G_TYPE_INT, height,
                                     "framerate", GST_TYPE_FRACTION, fps, 1,
                                     NULL),
                 "is-live", TRUE,
                 "format", GST_FORMAT_TIME,
                 NULL);

    // Add and link appsrc, videoconvert, and the video sink
    gst_bin_add_many(GST_BIN(pipeline),appsrc, converter, vsink, NULL);
    if (!gst_element_link_many(appsrc, converter, vsink, NULL)) {
        g_printerr("Failed to link appsrc, converter and sink!\n");
        gst_object_unref(pipeline);
        return -1;
    }

    #endif

    // Attach the Qt window
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (vsink2), xwinid);

    // Start the pipeline
    GstStateChangeReturn state_ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
    if (state_ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr("Failed to start pipeline!\n");
        gst_object_unref(pipeline);
        return -1;
    }

    // Start capturing video and push it to appsrc (v4l2src device=/dev/video0)
    cv::VideoCapture capture("v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! videoconvert ! video/x-raw,format=BGR ! appsink sync=0 drop=1", cv::CAP_GSTREAMER);
    cv::Mat orinFrame;

    gst_app_src_set_stream_type(GST_APP_SRC(appsrc), GST_APP_STREAM_TYPE_STREAM);

    while (capture.isOpened()) {

        capture.read(orinFrame);

        if (orinFrame.empty()) {
            break;
        }

        cv::imshow("Video", orinFrame);  // show the original frame

        // Additional processing:
        cv::Mat frame;
        frame = orinFrame;
        //cv::bitwise_not(orinFrame, frame);      // invert colors
        //cv::GaussianBlur(orinFrame, frame, cv::Size(5, 5), 0);  // Gaussian blur


        // Create a GStreamer buffer
        GstBuffer* buffer = gst_buffer_new_allocate(NULL, frame.total() * frame.elemSize(), NULL);
        GstMapInfo info;

        gst_buffer_map(buffer, &info, GST_MAP_WRITE);
        memcpy(info.data, frame.data, frame.total() * frame.elemSize());
        gst_buffer_unmap(buffer, &info);

        // Push the buffer to appsrc
        GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer);
        if (ret != GST_FLOW_OK) {
            g_printerr("Error pushing buffer to appsrc: %s\n", gst_flow_get_name(ret));
        }

        // Wait a bit to approximate the frame interval
        cv::waitKey(30);
    }

    // Stop capturing
    capture.release();

    // The pipeline is already PLAYING; signal end-of-stream to appsrc instead
    gst_app_src_end_of_stream(GST_APP_SRC(appsrc));

    // Qt main loop
    app.exec();

    // Stop the pipeline
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);

    return 0;
}

Result:

(screenshot: the processed camera stream rendered in the Qt window)

3. Extras: grayscale, Gaussian blur, face detection

#include <QApplication>
#include <QMainWindow>
#include <opencv2/opencv.hpp>
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <gst/video/videooverlay.h>

#include <opencv2/core.hpp>
#include <opencv2/dnn.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/highgui.hpp>
using namespace std;
using namespace cv;
using namespace cv::dnn;
int main(int argc, char *argv[]) {
    QApplication app(argc, argv);

    // Initialize GStreamer
    GMainLoop *loop;
    // Optional: set the GStreamer debug environment variable
    //g_setenv("GST_DEBUG", "4", TRUE);
    gst_init(&argc, &argv);
    loop = g_main_loop_new (NULL, FALSE);
    // Create the window
    QWidget *window = new QWidget();
    window->resize(640, 480);
    window->show();
    WId xwinid = window->winId();



#if 1
    // Using gst_parse_launch
    // Create the GStreamer pipeline
    std::string pipeline_str = "appsrc name=source caps=\"video/x-raw,format=BGR,width=640,height=480\" ! videoconvert ! xvimagesink name=vsink2";

    GstElement* pipeline = gst_parse_launch(pipeline_str.c_str(), NULL);
    // Get the appsrc element
    GstElement* appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "source");
    GstElement* vsink2 = gst_bin_get_by_name(GST_BIN(pipeline), "vsink2");
    //GstElement* vsink = gst_element_factory_make("glimagesink", "vsink");
    //g_object_set (vsink2, "video-sink", vsink, NULL);
#else
    // Assemble from individual elements:
    GstElement* pipeline = gst_pipeline_new("test-pipeline");

    // Video parameters
    int width = 640;
    int height = 480;
    int fps = 30;

    // Create the appsrc element
    GstElement* appsrc = gst_element_factory_make("appsrc", "video-source");
    // Create the sink elements
    GstElement* fpssink = gst_element_factory_make("fpsdisplaysink", "fpssink");
    GstElement* vsink = gst_element_factory_make("xvimagesink", "vsink");    //glimagesink  ximagesink  xvimagesink
    GstElement* overlay = gst_element_factory_make("timeoverlay", "overlay");
    GstElement* converter = gst_element_factory_make("videoconvert", "converter");
    g_object_set(G_OBJECT(appsrc), "caps",
                 gst_caps_new_simple("video/x-raw",
                                     "format", G_TYPE_STRING, "BGR",
                                     "width", G_TYPE_INT, width,
                                     "height", G_TYPE_INT, height,
                                     "framerate", GST_TYPE_FRACTION, fps, 1,
                                     NULL),
                 "is-live", TRUE,
                 "format", GST_FORMAT_TIME,
                 NULL);

    // Add and link appsrc, videoconvert, and the video sink
    gst_bin_add_many(GST_BIN(pipeline),appsrc, converter, vsink, NULL);
    if (!gst_element_link_many(appsrc, converter, vsink, NULL)) {
        g_printerr("Failed to link appsrc, converter and sink!\n");
        gst_object_unref(pipeline);
        return -1;
    }

    #endif

    // Attach the Qt window
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (vsink2), xwinid);

    // Start the pipeline
    GstStateChangeReturn state_ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
    if (state_ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr("Failed to start pipeline!\n");
        gst_object_unref(pipeline);
        return -1;
    }

    // Start capturing video and push it to appsrc (v4l2src device=/dev/video0)
    cv::VideoCapture capture("v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! videoconvert ! video/x-raw,format=BGR ! appsink sync=0 drop=1", cv::CAP_GSTREAMER);
    cv::Mat orinFrame;

    gst_app_src_set_stream_type(GST_APP_SRC(appsrc), GST_APP_STREAM_TYPE_STREAM);

    // Load the face cascade once, before the capture loop
    // (the original loaded it on every frame, which is needlessly slow)
    CascadeClassifier faceCascade;
    if (!faceCascade.load("/usr/share/opencv4/haarcascades/haarcascade_frontalface_alt.xml"))
    {
        std::cerr << "Error loading face cascade file!" << std::endl;
        return -1;
    }

    // Load a CNN model (unused in this demo)
    //Net net = readNetFromTensorflow("tensorflow_inception_graph.pb");

    while (capture.isOpened()) {

        capture.read(orinFrame);

        if (orinFrame.empty()) {
            break;
        }

        cv::imshow("Video", orinFrame);  // show the original frame

        // Additional processing:
        cv::Mat frame;
        frame = orinFrame;
        //cv::bitwise_not(orinFrame, frame);      // invert colors
        //cv::GaussianBlur(orinFrame, frame, cv::Size(5, 5), 0);  // Gaussian blur
        //cv::cvtColor(orinFrame, frame, cv::COLOR_BGR2GRAY);     // grayscale

        // Face detection: *********************************************************
        //Mat frame(480, 640, CV_8UC3, Scalar(0, 0, 255)); // simulated input: a solid red image
        resize(frame, frame, Size(640, 480));

        Mat frameGray;
        cvtColor(frame, frameGray, COLOR_BGR2GRAY);
        equalizeHist(frameGray, frameGray);

        std::vector<Rect> faces;
        faceCascade.detectMultiScale(frameGray, faces);

        for (const auto& face : faces)
        {
            rectangle(frame, face, Scalar(255, 0, 0), 2); // draw the face box
        }

        //*****************************************************************
        // Create a GStreamer buffer
        GstBuffer* buffer = gst_buffer_new_allocate(NULL, frame.total() * frame.elemSize(), NULL);
        GstMapInfo info;

        gst_buffer_map(buffer, &info, GST_MAP_WRITE);
        memcpy(info.data, frame.data, frame.total() * frame.elemSize());
        gst_buffer_unmap(buffer, &info);

        // Push the buffer to appsrc
        GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer);
        if (ret != GST_FLOW_OK) {
            g_printerr("Error pushing buffer to appsrc: %s\n", gst_flow_get_name(ret));
        }

        // Wait a bit to approximate the frame interval
        cv::waitKey(30);
    }

    // Stop capturing
    capture.release();

    // The pipeline is already PLAYING; signal end-of-stream to appsrc instead
    gst_app_src_end_of_stream(GST_APP_SRC(appsrc));

    // Qt main loop
    app.exec();

    // Stop the pipeline
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);

    return 0;
}


Result:

(screenshot: the camera stream with face-detection boxes rendered in the Qt window)

4. More to come

        Time is limited, so this section is still to be filled in. If you have ideas, feel free to discuss!
