
Problem description


I'm trying to guess which is the rigid transformation matrix between two 3D point clouds. The two point clouds are these:


  • Keypoints from the kinect (kinect_keypoints).

  • Keypoints from the 3D object model (object_keypoints).


I have tried two options:

[1].

**1. Calculate the centroid of each point cloud.**

**2.Center the points according to the centroid.**

**3. Calculate the covariance matrix.**

**4. Calculate the rotation matrix using the SVD decomposition of the covariance matrix.**

cvSVD( &_H, _W, _U, _V, CV_SVD_U_T ); // H = U * W * V^T (U is returned transposed)
cvMatMul( _V, _U, &_R );              // R = V * U^T

float _Tsrc[16] = { 1.f,0.f,0.f,0.f,
    0.f,1.f,0.f,0.f,
    0.f,0.f,1.f,0.f,
    -_gc_src.x,-_gc_src.y,-_gc_src.z,1.f };  // 1: src points to the origin
float _S[16] = { _scale,0.f,0.f,0.f,
    0.f,_scale,0.f,0.f,
    0.f,0.f,_scale,0.f,
    0.f,0.f,0.f,1.f };  // 2: scale the src points
float _R_src_to_dst[16] = { _Rdata[0],_Rdata[3],_Rdata[6],0.f,
    _Rdata[1],_Rdata[4],_Rdata[7],0.f,
    _Rdata[2],_Rdata[5],_Rdata[8],0.f,
    0.f,0.f,0.f,1.f }; // 3: rotate the src points
float _Tdst[16] = { 1.f,0.f,0.f,0.f,
    0.f,1.f,0.f,0.f,
    0.f,0.f,1.f,0.f,
    _gc_dst.x,_gc_dst.y,_gc_dst.z,1.f }; // 4: translate to the dst centroid

// _Tdst * _R_src_to_dst * _S * _Tsrc
mul_transform_mat( _S, _Tsrc, Rt );
mul_transform_mat( _R_src_to_dst, Rt, Rt );
mul_transform_mat( _Tdst, Rt, Rt );

[2]. Using estimateAffine3D from OpenCV.

        double _poseTrans[12];                  // estimateAffine3D writes CV_64F (double) output
        cv::Mat aff(3, 4, CV_64F, _poseTrans);
        std::vector<cv::Point3f> first, second; // first --> kinect_keypoints, second --> object_keypoints
        std::vector<uchar> inliers;
        cv::estimateAffine3D( first, second, aff, inliers );

        double _poseTrans2[16];

        for (int i=0; i<12; ++i)
        {
            _poseTrans2[i] = _poseTrans[i];
        }

        _poseTrans2[12] = 0.f;
        _poseTrans2[13] = 0.f;
        _poseTrans2[14] = 0.f;
        _poseTrans2[15] = 1.f;

The problem in the first one is that the transformation is not correct, and in the second one, if I multiply the kinect point cloud by the resulting matrix, some values are infinite.

Is there any solution for either of these options? Or an alternative one, apart from PCL?

Thanks in advance.

Answer

This is an old post, but an answer might be useful to someone ...

Your first approach can work in very specific cases (ellipsoidal point clouds or very elongated shapes), but is not appropriate for point clouds acquired by the kinect. As for your second approach, I am not familiar with the OpenCV function estimateAffine3D, but I suspect it assumes the two input point clouds correspond to the same physical points, which is not the case if you used a kinect point cloud (which contains noisy measurements) and points from an ideal 3D model (which are perfect).

You mentioned that you are aware of the Point Cloud Library (PCL) and do not want to use it. If possible, I think you might want to reconsider this, because PCL is much more appropriate than OpenCV for what you want to do (check the tutorial list, one of them covers exactly what you want to do: Aligning object templates to a point cloud).

However, here are some alternative solutions to your problem:


  1. If your two point clouds exactly correspond to the same physical points, your second approach should work, but you can also look into absolute orientation algorithms.

If your two point clouds do not correspond to the same physical points, you actually want to register (or align) them and you can use either:


  • One of the many variants of the Iterative Closest Point (ICP) algorithm, if you know the approximate position of your object.

  • A 3D feature-based registration approach, if you do not have such an initial estimate.

Again, all these approaches are already implemented in PCL.

