This article discusses MapCameraPointToDepthSpace (but without a KinectSensor or CoordinateMapper), which may be a useful reference for anyone facing the same problem.

Problem Description


So I'm trying to make it possible to play back recorded Kinect depth/color/body frames on a tablet or phone (i.e., without the SDK). Actually, I'm done with that part (look for the KinectEx project coming soon). The only thing that is unavailable is the mapping functionality since this is all tied to the sensor itself.


I'm recording the results of an initial call to GetDepthCameraIntrinsics and GetDepthFrameToCameraSpaceTable in my recording file's header. I can use this however needed. It seems like it may be possible to reproduce MapDepthSpaceToCameraSpace, but I'm really not clear how to proceed with the rest.
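For what it's worth, reproducing MapDepthSpaceToCameraSpace from the recorded table seems straightforward: each entry of GetDepthFrameToCameraSpaceTable holds the unit X/Y ray for one depth pixel, and multiplying by the measured depth (converted to meters) yields the camera space point. A minimal sketch, assuming the table was serialized as one (unit_x, unit_y) pair per pixel in row-major order (the function name and table layout here are assumptions, not SDK API):

```python
DEPTH_WIDTH, DEPTH_HEIGHT = 512, 424  # Kinect v2 depth frame dimensions

def depth_point_to_camera_space(x, y, depth_mm, table):
    """Convert a depth pixel (x, y) with depth in millimeters to a
    camera space point (X, Y, Z) in meters, using the recorded
    depth-frame-to-camera-space table (row-major (unit_x, unit_y) pairs)."""
    unit_x, unit_y = table[y * DEPTH_WIDTH + x]
    z = depth_mm / 1000.0  # camera space units are meters
    return (unit_x * z, unit_y * z, z)
```

This mirrors how the table is normally used with MapDepthFrameToCameraSpace: the table factors out the per-pixel ray geometry so only a multiply by depth remains.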


I'm far from an expert in optics (novice might be being generous), so the intrinsics info is Greek to me. Can I use that to do a projection of a CameraSpacePoint onto depth space (i.e., 3D to 2D projection)?
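In principle, yes: the recorded CameraIntrinsics describe a pinhole camera with radial distortion, which is enough to project a 3D camera space point back into depth pixel coordinates. Below is a minimal sketch of that projection using the SDK's CameraIntrinsics field names; the sign conventions (especially the flipped Y, since image rows grow downward) are an assumption and may need adjusting against real CoordinateMapper output:

```python
def camera_point_to_depth_space(p, intr):
    """Project a camera space point (X, Y, Z) in meters into depth
    pixel coordinates, using a pinhole model with radial distortion.
    `intr` holds the fields of the SDK's CameraIntrinsics struct."""
    x, y, z = p
    if z <= 0:
        return (float('-inf'), float('-inf'))  # behind the camera: unmappable
    u, v = x / z, y / z            # normalized image coordinates
    r2 = u * u + v * v             # squared radius from the optical axis
    d = (1.0
         + intr['RadialDistortionSecondOrder'] * r2
         + intr['RadialDistortionFourthOrder'] * r2 ** 2
         + intr['RadialDistortionSixthOrder'] * r2 ** 3)
    px = intr['PrincipalPointX'] + intr['FocalLengthX'] * u * d
    py = intr['PrincipalPointY'] - intr['FocalLengthY'] * v * d  # image y grows downward
    return (px, py)
```

A point straight ahead of the sensor (X = Y = 0) should land on the principal point, which is a quick sanity check to run against values produced by the real mapper before trusting the signs.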


And I'm completely clueless as to how I could accomplish depth <--> color mapping with what I have.
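The hard part is that the two calls being recorded don't capture everything depth-to-color mapping needs: the color camera's intrinsics and the depth-to-color extrinsic transform are internal to CoordinateMapper and aren't exposed by GetDepthCameraIntrinsics or GetDepthFrameToCameraSpaceTable, so they would have to be calibrated or estimated offline. Assuming those were somehow obtained, the pipeline would be: depth pixel → camera space (via the recorded table) → rigid transform into the color camera's frame → pinhole projection. A sketch under those assumptions (R, t, and the color intrinsics dictionary are all hypothetical inputs, not anything the SDK provides):

```python
DEPTH_WIDTH = 512  # Kinect v2 depth frame width

def depth_pixel_to_color_space(x, y, depth_mm, table, R, t, color_intr):
    """Map a depth pixel to color pixel coordinates.
    table      -- recorded depth-to-camera-space table, (unit_x, unit_y) per pixel
    R, t       -- hypothetical depth-to-color rotation (3x3 rows) and translation
    color_intr -- hypothetical color intrinsics: fx, fy, cx, cy (no distortion here)"""
    # 1. depth pixel -> camera space via the recorded table
    unit_x, unit_y = table[y * DEPTH_WIDTH + x]
    z = depth_mm / 1000.0
    X, Y, Z = unit_x * z, unit_y * z, z
    # 2. rigid transform into the color camera's frame
    Xc = R[0][0] * X + R[0][1] * Y + R[0][2] * Z + t[0]
    Yc = R[1][0] * X + R[1][1] * Y + R[1][2] * Z + t[1]
    Zc = R[2][0] * X + R[2][1] * Y + R[2][2] * Z + t[2]
    # 3. project with the color camera's intrinsics (undistorted pinhole)
    return (color_intr['cx'] + color_intr['fx'] * Xc / Zc,
            color_intr['cy'] - color_intr['fy'] * Yc / Zc)
```

The reverse (color → depth) direction is harder still, since a color pixel has no depth of its own; it normally requires searching the depth frame for the point that projects onto that color pixel, which is part of why the SDK keeps this inside CoordinateMapper.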


Any help here would be appreciated. Is this possible, or am I just going to have to do without mapping?





Paul T. York, Ph.D., Computer and Information Sciences, Georgia Regents University

Recommended Answer


