Problem Description
I'm trying to use the hardware H264 encoder on Android to create video from the camera, and use FFmpeg to mux in audio (all on the Android phone itself).
What I've accomplished so far is packetizing the H264 video into RTSP packets and decoding it with VLC (over UDP), so I know the video is at least correctly formatted. However, I'm having trouble getting the video data to ffmpeg in a format it can understand.
I've tried sending the same RTSP packets to port 5006 on localhost (over UDP), then providing ffmpeg with an SDP file that tells it which local port the video stream is coming in on and how to decode the video, if I understand RTSP streaming correctly. However, this doesn't work, and I'm having trouble diagnosing why, as ffmpeg just sits there waiting for input.
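For reference, a minimal SDP file for this setup might look like the following (the payload type 96 and the `packetization-mode` attribute are assumptions on my part; they must match how the RTP packets are actually built):

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Android H264 stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1
```

With a file like this, ffmpeg is pointed at the SDP file rather than at the port directly (newer ffmpeg builds also require whitelisting the protocols involved):

```
ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp -c copy out.mp4
```

If ffmpeg still sits there silently, a common cause is a mismatch between what the SDP declares (port, payload type) and the packets actually being sent.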
For reasons of latency and scalability I can't just send the video and audio to a server and mux them there; it has to be done on the phone, in as lightweight a manner as possible.
What I guess I'm looking for are suggestions as to how this can be accomplished. The optimal solution would be sending the packetized H264 video to ffmpeg over a pipe, but then I can't send ffmpeg the SDP file parameters it needs to decode the video.
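A possible workaround, sketched here as an assumption rather than something from the original post: the SDP file is only needed because RTP packetization strips the stream's own framing. If the encoder's raw Annex-B NAL units (with start codes) are written to the pipe instead of RTP packets, ffmpeg's raw H.264 demuxer (`-f h264`) can parse the stream directly from stdin with no SDP at all, along the lines of:

```
ffmpeg -f h264 -i pipe:0 -i audio.aac -c copy out.mp4
```

Here `audio.aac` is a hypothetical audio source; `-c copy` muxes both streams without re-encoding.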
I can provide more information on request, such as how ffmpeg is compiled for Android, but I doubt that's necessary.
Oh, and the way I start ffmpeg is through the command line; I would really rather avoid mucking about with JNI if that's at all possible.
Any help would be much appreciated, thanks.
Accepted Answer
Have you tried using java.lang.Runtime?
String[] parameters = {"ffmpeg", "other", "args"};
Process program = Runtime.getRuntime().exec(parameters);
InputStream in = program.getInputStream();    // the process's stdout
OutputStream out = program.getOutputStream(); // the process's stdin
InputStream err = program.getErrorStream();   // the process's stderr
Then you write to the process's stdin (via `getOutputStream`) and read from its stdout and stderr. It's not a pipe, but it should be better than using a network interface.
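To make the pattern above concrete, here is a small self-contained sketch (the helper's name and the stderr-draining thread are my additions, not part of the original answer). One practical caveat with ffmpeg specifically: it logs heavily to stderr, so if stderr is never read, the pipe buffer eventually fills and the child process stalls; draining it on a separate thread avoids that.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class PipeToProcess {
    // Runs a command, writes `input` to its stdin, and returns its stdout.
    // stderr is drained concurrently so a chatty child (like ffmpeg, which
    // logs to stderr) never blocks on a full pipe buffer.
    static byte[] runFilter(String[] command, byte[] input)
            throws IOException, InterruptedException {
        Process p = Runtime.getRuntime().exec(command);

        // Discard stderr on a separate thread (a real app might log it).
        Thread errDrain = new Thread(() -> {
            try (InputStream err = p.getErrorStream()) {
                byte[] buf = new byte[4096];
                while (err.read(buf) != -1) { /* discard */ }
            } catch (IOException ignored) { }
        });
        errDrain.start();

        // Feed the child's stdin, then close it so the child sees EOF.
        try (OutputStream stdin = p.getOutputStream()) {
            stdin.write(input);
        }

        // Collect the child's stdout.
        ByteArrayOutputStream stdout = new ByteArrayOutputStream();
        try (InputStream out = p.getInputStream()) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = out.read(buf)) != -1) {
                stdout.write(buf, 0, n);
            }
        }

        p.waitFor();
        errDrain.join();
        return stdout.toByteArray();
    }
}
```

Note that this sketch writes all of stdin before reading any stdout; for a long-running stream like encoded video you would also move the stdin-writing loop onto its own thread.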