
Other articles (34)
-
The SPIPmotion queue
28 November 2010
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table consists of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document must be attached automatically; objet, the type of object to which (...) -
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out. -
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded to Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
Audio files are encoded to Ogg (supported by HTML5) and MP3 (supported by Flash).
Where possible, text is analyzed in order to retrieve the data needed for search engine detection, and then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
On other sites (6133)
-
What Is Data Ethics & Why Is It Important in Business?
9 May 2024, by Erin -
Feeding MediaCodec with byte data from AVPacket : problems with output buffers
2 March 2016, by serg66
Description of my task:
I’m developing a video player on Android (API >= 17). It has to work with both HLS and multicast video. In addition, it has to support multiple audio tracks.
Why I decided to use ffmpeg:
- On some devices MediaPlayer doesn’t support multicast video
- MediaExtractor doesn’t work with HLS (getTrackCount() returns 0)
- ffmpeg works with both HLS and multicast
My idea:
I demux the stream with ffmpeg in a loop. I get the CSD from videoStream->codec->extradata and configure the MediaFormat accordingly. On each iteration, when a new video AVPacket is available, I filter its buffer with the h264_mp4toannexb bitstream filter (set up via av_bitstream_filter_init). Then I call the Java method onNewVideoData, in which I get the AVPacket byte array, clear the available input buffer, and fill it with the new data. I also get the pts. Since the stream has no beginning, I additionally rebase the timestamps by subtracting the pts of the first AVPacket from all the following ones; the first pts I set to 0. Then I call queueInputBuffer to send the buffer to the decoder.
I use two threads: one for getting and submitting data to the input buffers, and another one for posting output to the Surface.
The full player C code:
#include <jni.h>
#include <android/log.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/buffer.h>
#define TAG "ffmpegPlayer"
struct
{
const char* url;
jint width;
jint height;
jfloat aspectRatio;
jint streamsCount;
AVFormatContext* formatContext;
AVStream* videoStream;
} context;
AVPacket packet;
AVBitStreamFilterContext* avBitStreamFilterContext;
JNIEXPORT jbyteArray JNICALL Java_com_example_app_FfmpegPlayer_getCsdNative(JNIEnv* env, jobject x)
{
jbyteArray arr = (*env)->NewByteArray(env, context.videoStream->codec->extradata_size);
(*env)->SetByteArrayRegion(env, arr, 0, context.videoStream->codec->extradata_size, (jbyte*)context.videoStream->codec->extradata);
return arr;
}
JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getWidthNative(JNIEnv* env, jobject x)
{
return context.width;
}
JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getHeightNative(JNIEnv* env, jobject x)
{
return context.height;
}
JNIEXPORT jfloat JNICALL Java_com_example_app_FfmpegPlayer_getAspectRatioNative(JNIEnv* env, jobject x)
{
return context.aspectRatio;
}
JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getStreamsCountNative(JNIEnv* env, jobject x)
{
return context.streamsCount;
}
JNIEXPORT jlong JNICALL Java_com_example_app_FfmpegPlayer_getPtsNative(JNIEnv* env, jobject obj)
{
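// convert pts from the stream's time_base to microseconds, as MediaCodec's queueInputBuffer expects presentationTimeUs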
return packet.pts * av_q2d(context.videoStream->time_base) * 1000000;
}
JNIEXPORT jboolean JNICALL Java_com_example_app_FfmpegPlayer_initNative(JNIEnv* env, jobject obj, const jstring u)
{
av_register_all();
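// set up the h264_mp4toannexb filter (legacy av_bitstream_filter API, replaced by av_bsf_* in newer FFmpeg)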
avBitStreamFilterContext = av_bitstream_filter_init("h264_mp4toannexb");
const char* url = (*env)->GetStringUTFChars(env, u , NULL);
__android_log_print(ANDROID_LOG_DEBUG, TAG, "Init: %s", url);
AVFormatContext* formatContext = NULL;
if (avformat_open_input(&formatContext, url, NULL, NULL) < 0) {
__android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to open input");
return JNI_FALSE;
}
if (avformat_find_stream_info(formatContext, NULL) < 0) {
__android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to find stream info");
return JNI_FALSE;
}
AVInputFormat * iformat = formatContext->iformat;
__android_log_print(ANDROID_LOG_DEBUG, TAG, "format: %s", iformat->name);
context.streamsCount = formatContext->nb_streams;
__android_log_print(ANDROID_LOG_DEBUG, TAG, "Streams count: %d", formatContext->nb_streams);
int i = 0;
AVStream* videoStream = NULL;
AVDictionaryEntry* lang;
for (i = 0; i < formatContext->nb_streams; i++) {
int codecType = formatContext->streams[i]->codec->codec_type;
if (videoStream == NULL && codecType == AVMEDIA_TYPE_VIDEO) {
videoStream = formatContext->streams[i];
}
else if (codecType == AVMEDIA_TYPE_AUDIO) {
lang = av_dict_get(formatContext->streams[i]->metadata, "language", NULL, 0);
if (lang != NULL) {
__android_log_print(ANDROID_LOG_DEBUG, TAG, "Audio stream %d: %s", i, lang->value);
}
}
}
if (videoStream == NULL) {
__android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to find video stream");
return JNI_FALSE;
}
context.videoStream = videoStream;
__android_log_print(ANDROID_LOG_DEBUG, TAG, "Video stream: %d", videoStream->index);
AVCodecContext *codecContext = formatContext->streams[videoStream->index]->codec;
__android_log_print(ANDROID_LOG_DEBUG, TAG, "width: %d, height: %d", codecContext->width, codecContext->height);
context.width = codecContext->width;
context.height = codecContext->height;
AVRational aspectRatio = codecContext->sample_aspect_ratio;
__android_log_print(ANDROID_LOG_DEBUG, TAG, "aspect ratio: %d/%d", aspectRatio.num, aspectRatio.den);
context.aspectRatio = (float) aspectRatio.num / aspectRatio.den;
context.formatContext = formatContext;
return JNI_TRUE;
}
void filterPacket()
{
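// convert length-prefixed MP4-style NAL units to Annex B start codes; the filter may allocate a new buffer for packet.data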
av_bitstream_filter_filter(avBitStreamFilterContext, context.videoStream->codec, NULL, &packet.data, &packet.size, packet.data, packet.size, packet.flags);
}
JNIEXPORT void JNICALL Java_com_example_app_FfmpegPlayer_startNative(JNIEnv* env, jobject obj)
{
jclass cl = (*env)->GetObjectClass(env, obj);
jmethodID updateMethodId = (*env)->GetMethodID(env, cl, "onNewVideoData", "()V");
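// demux loop: read packets until EOF/error and forward video packets to Java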
while (av_read_frame(context.formatContext, &packet) >= 0) {
if (context.formatContext == NULL) {
return;
}
if (packet.stream_index == context.videoStream->index) {
filterPacket();
(*env)->CallVoidMethod(env, obj, updateMethodId);
}
}
}
JNIEXPORT jbyteArray JNICALL Java_com_example_app_FfmpegPlayer_getVideoDataNative(JNIEnv* env, jobject obj)
{
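// note: packet.buf is the packet's original reference-counted buffer; after filterPacket(), packet.data/packet.size may point to a different, newly allocated buffer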
AVBufferRef *buf = packet.buf;
jbyteArray arr = (*env)->NewByteArray(env, buf->size);
(*env)->SetByteArrayRegion(env, arr, 0, buf->size, (jbyte*)buf->data);
return arr;
}
The full Java code:
package com.example.app;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;
public class FfmpegPlayer {
static {
System.loadLibrary("avutil-54");
System.loadLibrary("swscale-3");
System.loadLibrary("swresample-1");
System.loadLibrary("avcodec-56");
System.loadLibrary("avformat-56");
System.loadLibrary("avfilter-5");
System.loadLibrary("ffmpeg-player");
}
private native boolean initNative(String url);
private native void startNative();
private native int getWidthNative();
private native int getHeightNative();
private native float getAspectRatioNative();
private native byte[] getVideoDataNative();
private native long getPtsNative();
private native byte[] getCsdNative();
private String source;
private PlayerThread playerThread;
private int width;
private int height;
private MediaCodec decoder;
private ByteBuffer[] inputBuffers;
private Surface surface;
private long firstPtsTime;
public FfmpegPlayer(Surface surface) {
this.surface = surface;
}
public void setDataSource(String source) {
if (!initNative(source)) {
return;
}
width = getWidthNative();
height = getHeightNative();
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width * height);
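// "csd-0" holds the codec-specific data (here the SPS/PPS taken from the stream's extradata)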
format.setByteBuffer("csd-0", ByteBuffer.wrap(getCsdNative()));
LogUtils.log("CSD: ");
outputAsHex(getCsdNative());
try {
decoder = MediaCodec.createDecoderByType("video/avc");
decoder.configure(format, surface, null, 0);
decoder.start();
playerThread = new PlayerThread();
playerThread.start();
new OutputThread().run();
}
catch (Exception e) {
e.printStackTrace();
}
}
public void onNewVideoData() {
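// non-blocking dequeue: if no input buffer is free, index is -1 and this packet is dropped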
int index = decoder.dequeueInputBuffer(0);
if (index >= 0) {
byte[] data = getVideoDataNative();
ByteBuffer byteBuffer = decoder.getInputBuffers()[index];
byteBuffer.clear();
byteBuffer.put(data);
long pts = getPtsNative();
LogUtils.log("Input AVPacket pts: " + pts);
LogUtils.log("Input AVPacket data length: " + data.length);
LogUtils.log("Input AVPacket data: ");
outputAsHex(data);
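// rebase timestamps so the first queued packet starts at pts 0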
if (firstPtsTime == 0) {
firstPtsTime = pts;
pts = 0;
}
else {
pts -= firstPtsTime;
}
decoder.queueInputBuffer(index, 0, data.length, pts, 0);
}
}
private void outputAsHex(byte[] data) {
String[] test = new String[data.length];
for (int i = 0; i < data.length; i++) {
test[i] = String.format("%02x", data[i]);
}
LogUtils.log(test);
}
private class PlayerThread extends Thread {
@Override
public void run() {
super.run();
startNative();
}
}
private class OutputThread extends Thread {
@Override
public void run() {
super.run();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
while (true) {
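// non-blocking poll; negative results such as INFO_OUTPUT_FORMAT_CHANGED (-2) or INFO_OUTPUT_BUFFERS_CHANGED (-3) are not handled here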
int index = decoder.dequeueOutputBuffer(info, 0);
if (index >= 0) {
ByteBuffer buffer = decoder.getOutputBuffers()[index];
buffer.position(info.offset);
buffer.limit(info.offset + info.size);
byte[] test = new byte[info.size];
for (int i = 0; i < info.size; i++) {
test[i] = buffer.get(i);
}
LogUtils.log("Output info: size=" + info.size + ", presentationTimeUs=" + info.presentationTimeUs + ",offset=" + info.offset + ",flags=" + info.flags);
LogUtils.log("Output data: ");
outputAsHex(test);
decoder.releaseOutputBuffer(index, true);
}
}
}
}
}
The problem:
For the tests I used a TS file with the following video stream:
Codec: H264 - MPEG-4 AVC (part 10) (h264)
Resolution: 720x578
Frame rate: 25
Decoded format: Planar 4:2:0 YUV
The CSD is the following:
[00, 00, 00, 01, 09, 10, 00, 00, 00, 01, 27, 4d, 40, 1e, 9a, 62, 01, 68, 48, b0, 44, 20, a0, a0, a8, 00, 00, 03, 00, 08, 00, 00, 03, 01, 94, a0, 00, 00, 00, 01, 28, ee, 3c, 80]
On different devices I have different results, but I couldn’t get the video to show on the Surface.
Input:
Input AVPacket pts: 351519222
Input AVPacket data length: 54941
Input AVPacket data: [00, 00, 00, 01, 09, 10, 00, 00, 00, 01, 27, 4d, 40, 1e, 9a, 62, 01, 68, 48, b0, 44, 20, a0, a0, a8, 00, 00, 03, 00, 08, 00, 00, 03, 01, 94, a0, 00, 00, 00, 01,...]
------------------------------------
Input AVPacket pts: 351539222
Input AVPacket data length: 9605
Input AVPacket data: [00, 00, 00, 01, 09, 30, 00, 00, 00, 01, 06, 01, 01, 24, 80, 00, 00, 00, 01, 21, e3, bd, da, e4, 46, c5, 8b, 6b, 7d, 07, 59, 23, 6f, 92, e9, fb, 3b, b9, 4d, f9,...]
------------------------------------
Input AVPacket pts: 351439222
Input AVPacket data length: 1985
Input AVPacket data: [00, 00, 00, 01, 09, 50, 00, 00, 00, 01, 06, 01, 01, 14, 80, 00, 00, 00, 01, 21, a8, f2, 74, 69, 14, 54, 4d, c5, 8b, e8, 42, 52, ac, 80, 53, b4, 4d, 24, 1f, 6c,...]
------------------------------------
Input AVPacket pts: 351459222
Input AVPacket data length: 2121
Input AVPacket data: [00, 00, 00, 01, 09, 50, 00, 00, 00, 01, 06, 01, 01, 24, 80, 00, 00, 00, 01, 21, a8, f3, 74, e9, 0b, 8b, 17, e8, 43, f8, 10, 88, ca, 2b, 11, 53, c8, 31, f0, 0b,...]
... and so on
Asus Zenfone (Android 5.0.2) output thread (after decoding, strange results: 25 buffers of only 8 bytes of data each):
Output info: size=8, presentationTimeUs=-80001,offset=0,flags=0
Output data:
[01, 00, 00, 00, 90, c5, 99, ac]
---------------------------
Output info: size=8, presentationTimeUs=0,offset=0,flags=1
Output data:
[01, 00, 00, 00, 78, ea, 86, ac]
---------------------------
Output info: size=8, presentationTimeUs=720000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e8, 86, b6, ac]
---------------------------
Output info: size=8, presentationTimeUs=780000,offset=0,flags=1
Output data:
[01, 00, 00, 00, c0, cb, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=840000,offset=0,flags=0
Output data:
[01, 00, 00, 00, 80, 87, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=960000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, 3f, 8b, ac]
---------------------------
Output info: size=8, presentationTimeUs=1040000,offset=0,flags=1
Output data:
[01, 00, 00, 00, f8, 76, 85, ac]
---------------------------
Output info: size=8, presentationTimeUs=1180000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, 87, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=1260000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e8, b5, d2, ac]
---------------------------
Output info: size=8, presentationTimeUs=1800000,offset=0,flags=0
Output data:
[01, 00, 00, 00, 90, c5, 99, ac]
---------------------------
Output info: size=8, presentationTimeUs=1860000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, c0, 84, ac]
---------------------------
Output info: size=8, presentationTimeUs=2080000,offset=0,flags=1
Output data:
[01, 00, 00, 00, c0, cb, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=3440000,offset=0,flags=1
Output data:
[01, 00, 00, 00, 80, 87, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=3520000,offset=0,flags=0
Output data:
[01, 00, 00, 00, 78, ea, 86, ac]
---------------------------
Output info: size=8, presentationTimeUs=4160000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e8, 86, b6, ac]
---------------------------
Output info: size=8, presentationTimeUs=4300000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, 3f, 8b, ac]
---------------------------
Output info: size=8, presentationTimeUs=4400000,offset=0,flags=1
Output data:
[01, 00, 00, 00, 90, c5, 99, ac]
---------------------------
Output info: size=8, presentationTimeUs=4480000,offset=0,flags=1
Output data:
[01, 00, 00, 00, f8, 76, 85, ac]
---------------------------
Output info: size=8, presentationTimeUs=4680000,offset=0,flags=0
Output data:
[01, 00, 00, 00, c0, cb, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=4720000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, c0, 84, ac]
---------------------------
Output info: size=8, presentationTimeUs=4760000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, 87, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=4800000,offset=0,flags=0
Output data:
[01, 00, 00, 00, 58, 54, 83, ac]
---------------------------
Output info: size=8, presentationTimeUs=5040000,offset=0,flags=0
Output data:
[01, 00, 00, 00, e8, b5, d2, ac]
---------------------------
Output info: size=8, presentationTimeUs=5100000,offset=0,flags=1
Output data:
[01, 00, 00, 00, 80, 87, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=5320000,offset=0,flags=0
Output data:
[01, 00, 00, 00, 78, ea, 86, ac]
---------------------------
Output info: size=8, presentationTimeUs=5380000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e8, 86, b6, ac]
Other Asus Zenfone logs:
01-25 17:11:36.859 4851-4934/com.example.app I/OMXClient: Using client-side OMX mux.
01-25 17:11:36.865 317-1075/? I/OMX-VDEC-1080P: component_init: OMX.qcom.video.decoder.avc : fd=43
01-25 17:11:36.867 317-1075/? I/OMX-VDEC-1080P: Capabilities: driver_name = msm_vidc_driver, card = msm_vdec_8974, bus_info = , version = 1, capabilities = 4003000
01-25 17:11:36.881 317-1075/? I/OMX-VDEC-1080P: omx_vdec::component_init() success : fd=43
01-25 17:11:36.885 4851-4934/com.example.app I/ACodec: [OMX.qcom.video.decoder.avc] DRC Mode: Dynamic Buffer Mode
01-25 17:11:36.893 317-20612/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.933 317-12269/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.933 317-12269/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.935 317-5559/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.957 317-5559/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.957 4851-4934/com.example.app I/ExtendedCodec: Decoder will be in frame by frame mode
01-25 17:11:36.963 317-1075/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.963 317-1075/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.964 317-20612/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.describeColorFormat not implemented
01-25 17:11:37.072 317-20612/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.describeColorFormat not implemented
01-25 17:11:36.964 4851-4934/com.example.app W/ACodec: do not know color format 0x7fa30c04 = 2141391876
Asus Nexus 7 (Android 6.0.1) crashes:
01-25 17:23:06.921 11602-11695/com.example.app I/OMXClient: Using client-side OMX mux.
01-25 17:23:06.952 11602-11694/com.example.app I/MediaCodec: [OMX.qcom.video.decoder.avc] setting surface generation to 11880449
01-25 17:23:06.954 194-194/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.storeANWBufferInMetadata not implemented
01-25 17:23:06.954 194-194/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.storeMetaDataInBuffers not implemented
01-25 17:23:06.954 194-194/? E/OMXNodeInstance: getExtensionIndex(45:qcom.decoder.avc, OMX.google.android.index.storeMetaDataInBuffers) ERROR: NotImplemented(0x80001006)
01-25 17:23:06.954 11602-11695/com.example.app E/ACodec: [OMX.qcom.video.decoder.avc] storeMetaDataInBuffers failed w/ err -2147483648
01-25 17:23:06.963 11602-11695/com.example.app D/SurfaceUtils: set up nativeWindow 0xa0b7a108 for 720x576, color 0x7fa30c03, rotation 0, usage 0x42002900
01-25 17:23:06.967 194-604/? E/OMX-VDEC-1080P: GET_MV_BUFFER_SIZE returned: Size: 122880 and alignment: 8192
01-25 17:23:07.203 11602-11695/com.example.app W/AHierarchicalStateMachine: Warning message AMessage(what = 'omxI') = {
int32_t type = 0
int32_t event = 2130706432
int32_t data1 = 1
int32_t data2 = 0
} unhandled in root state.
01-25 17:23:07.232 11602-11695/com.example.app D/SurfaceUtils: set up nativeWindow 0xa0b7a108 for 720x576, color 0x7fa30c03, rotation 0, usage 0x42002900
01-25 17:23:07.241 194-194/? E/OMX-VDEC-1080P: GET_MV_BUFFER_SIZE returned: Size: 122880 and alignment: 8192
01-25 17:23:07.242 194-194/? E/OMX-VDEC-1080P: Insufficient sized buffer given for playback, expected 671744, got 663552
01-25 17:23:07.242 194-194/? E/OMXNodeInstance: useBuffer(45:qcom.decoder.avc, Output:1 671744@0xb60a0860) ERROR: BadParameter(0x80001005)
01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: registering GraphicBuffer 0 with OMX IL component failed: -2147483648
01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: Failed to allocate output port buffers after port reconfiguration: (-2147483648)
01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: signalError(omxError 0x80001001, internalError -2147483648)
01-25 17:23:07.243 11602-11694/com.example.app E/MediaCodec: Codec reported err 0x80001001, actionCode 0, while in state 6
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: java.lang.IllegalStateException
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at android.media.MediaCodec.native_dequeueOutputBuffer(Native Method)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at android.media.MediaCodec.dequeueOutputBuffer(MediaCodec.java:2379)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at com.example.app.FfmpegPlayer$OutputThread.run(FfmpegPlayer.java:122)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at com.example.app.FfmpegPlayer.setDataSource(FfmpegPlayer.java:66)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at com.example.app.activities.TestActivity$2.surfaceCreated(TestActivity.java:151)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at android.view.SurfaceView.updateWindow(SurfaceView.java:583)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at android.view.SurfaceView$3.onPreDraw(SurfaceView.java:177)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.ViewTreeObserver.dispatchOnPreDraw(ViewTreeObserver.java:944)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:2055)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1107)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:6013)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.Choreographer$CallbackRecord.run(Choreographer.java:858)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.Choreographer.doCallbacks(Choreographer.java:670)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.Choreographer.doFrame(Choreographer.java:606)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:844)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.os.Handler.handleCallback(Handler.java:739)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.os.Handler.dispatchMessage(Handler.java:95)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.os.Looper.loop(Looper.java:148)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.app.ActivityThread.main(ActivityThread.java:5417)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at java.lang.reflect.Method.invoke(Native Method)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)
Another device always has empty output buffers, though the indexes are >= 0.
What am I doing wrong?
-
HTTP Livestreaming with ffmpeg
12 December 2020, by Hugo
Some context: I have an MKV file that I am attempting to stream to http://localhost:8090/test.flv as an FLV file.
The stream begins and then immediately ends.
The command I am using is:
sudo ffmpeg -re -i input.mkv -c:v libx264 -maxrate 1000k -bufsize 2000k -an -bsf:v h264_mp4toannexb -g 50 http://localhost:8090/test.flv




A breakdown of what these options do, in case this post becomes useful for someone else:
sudo : run as root
ffmpeg : the encoder/streamer itself
-re : read the input at its native frame rate, i.e. stream in real time
-i input.mkv : input option and path to the input file
-c:v libx264 : encode the video with the libx264 H.264 encoder
-maxrate 1000k -bufsize 2000k : VBV rate-control options, capping the bitrate at 1000 kbit/s against a 2000 kbit buffer
-an : disable audio
-bsf:v h264_mp4toannexb : a bitstream filter that converts the H.264 stream to Annex B format
-g 50 : GOP size, i.e. emit a keyframe every 50 frames
http://localhost:8090/test.flv : output over HTTP to localhost on port 8090 as a file called test.flv
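Worth noting: ffserver's own sample comments (see the config below) suggest the encoder should publish to the feed URL declared in a <Feed> block (a .ffm name), while players request a stream name declared in a <Stream> block. A minimal sketch of that split, assuming the feed1.ffm feed and test1.mpg stream from the config below:

# publish the encoder output to the feed defined in ffserver.conf
ffmpeg -re -i input.mkv -c:v libx264 -maxrate 1000k -bufsize 2000k -an http://localhost:8090/feed1.ffm
# players then request a declared stream name, e.g.
# ffplay http://localhost:8090/test1.mpg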
Anyway, the actual issue I have is that it begins to stream for about a second and then immediately ends.
The ffmpeg command output:
ffmpeg version N-80901-gfebc862 Copyright (c) 2000-2016 the FFmpeg developers
 built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
 configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --mandir=/usr/share/man --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libfreetype --enable-gnutls --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvidstab
 libavutil 55. 28.100 / 55. 28.100
 libavcodec 57. 48.101 / 57. 48.101
 libavformat 57. 41.100 / 57. 41.100
 libavdevice 57. 0.102 / 57. 0.102
 libavfilter 6. 47.100 / 6. 47.100
 libavresample 3. 0. 0 / 3. 0. 0
 libswscale 4. 1.100 / 4. 1.100
 libswresample 2. 1.100 / 2. 1.100
 libpostproc 54. 0.100 / 54. 0.100
Input #0, matroska,webm, from 'input.mkv':
 Metadata:
 encoder : libebml v1.3.0 + libmatroska v1.4.0
 creation_time : 1970-01-01 00:00:02
 Duration: 00:01:32.26, start: 0.000000, bitrate: 4432 kb/s
 Stream #0:0(eng): Video: h264 (High 10), yuv420p10le, 1920x1080 [SAR 1:1 DAR 16:9], 23.98 fps, 23.98 tbr, 1k tbn, 47.95 tbc (default)
 Stream #0:1(nor): Audio: flac, 48000 Hz, stereo, s16 (default)
[libx264 @ 0x2e1c380] using SAR=1/1
[libx264 @ 0x2e1c380] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x2e1c380] profile High, level 4.0
[libx264 @ 0x2e1c380] 264 - core 148 r2643 5c65704 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=1 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=50 keyint_min=5 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 vbv_maxrate=1000 vbv_bufsize=2000 crf_max=0.0 nal_hrd=none filler=0 ip_ratio=1.40 aq=1:1.00
[flv @ 0x2e3f0a0] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
Output #0, flv, to 'http://localhost:8090/test.flv':
 Metadata:
 encoder : Lavf57.41.100
 Stream #0:0(eng): Video: h264 (libx264) ([7][0][0][0] / 0x0007), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], q=-1--1, 23.98 fps, 1k tbn, 23.98 tbc (default)
 Metadata:
 encoder : Lavc57.48.101 libx264
 Side data:
 cpb: bitrate max/min/avg: 1000000/0/0 buffer size: 2000000 vbv_delay: -1
Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Press [q] to stop, [?] for help
Killed 26 fps= 26 q=0.0 size= 0kB time=00:00:00.00 bitrate=N/A speed= 0x 




The ffserver outputs:



Sat Aug 20 12:40:11 2016 File '/test.flv' not found
Sat Aug 20 12:40:11 2016 [SERVER IP] - - [POST] "/test.flv HTTP/1.1" 404 189




The config file is:



#Sample ffserver configuration file

# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090

# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0

# Number of simultaneous HTTP connections that can be handled. It has
# to be defined *before* the MaxClients parameter, since it defines the
# MaxClients maximum limit.
MaxHTTPConnections 2000

# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 1000

# This the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 1000

# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -

# Suppress that if you want to launch ffserver as a daemon.
#NoDaemon


##################################################################
# Definition of the live feeds. Each live feed contains one video
# and/or audio sequence coming from an ffmpeg encoder or another
# ffserver. This sequence may be encoded simultaneously with several
# codecs at several resolutions.

<Feed feed1.ffm>

ACL allow 192.168.0.0 192.168.255.255

# You must use 'ffmpeg' to send a live feed to ffserver. In this
# example, you can type:
#
#ffmpeg http://localhost:8090/test.ffm

# ffserver can also do time shifting. It means that it can stream any
# previously recorded live stream. The request should contain:
# "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]".You must specify
# a path where the feed is stored on disk. You also specify the
# maximum size of the feed, where zero means unlimited. Default:
# File=/tmp/feed_name.ffm FileMaxSize=5M
File /tmp/feed1.ffm
FileMaxSize 200m

# You could specify
# ReadOnlyFile /saved/specialvideo.ffm
# This marks the file as readonly and it will not be deleted or updated.

# Specify launch in order to start ffmpeg automatically.
# First ffmpeg must be defined with an appropriate path if needed,
# after that options can follow, but avoid adding the http:// field
#Launch ffmpeg

# Only allow connections from localhost to the feed.
 ACL allow 127.0.0.1

</Feed>


##################################################################
# Now you can define each stream which will be generated from the
# original audio and video stream. Each format has a filename (here
# 'test1.mpg'). FFServer will send this stream when answering a
# request containing this filename.

<Stream test1.mpg>

# coming from live feed 'feed1'
Feed feed1.ffm

# Format of the stream : you can choose among:
# mpeg : MPEG-1 multiplexed video and audio
# mpegvideo : only MPEG-1 video
# mp2 : MPEG-2 audio (use AudioCodec to select layer 2 and 3 codec)
# ogg : Ogg format (Vorbis audio codec)
# rm : RealNetworks-compatible stream. Multiplexed audio and video.
# ra : RealNetworks-compatible stream. Audio only.
# mpjpeg : Multipart JPEG (works with Netscape without any plugin)
# jpeg : Generate a single JPEG image.
# asf : ASF compatible streaming (Windows Media Player format).
# swf : Macromedia Flash compatible stream
# avi : AVI format (MPEG-4 video, MPEG audio sound)
Format mpeg

# Bitrate for the audio stream. Codecs usually support only a few
# different bitrates.
AudioBitRate 32

# Number of audio channels: 1 = mono, 2 = stereo
AudioChannels 2

# Sampling frequency for audio. When using low bitrates, you should
# lower this frequency to 22050 or 11025. The supported frequencies
# depend on the selected audio codec.
AudioSampleRate 44100

# Bitrate for the video stream
VideoBitRate 64

# Ratecontrol buffer size
VideoBufferSize 40

# Number of frames per second
VideoFrameRate 3

# Size of the video frame: WxH (default: 160x128)
# The following abbreviations are defined: sqcif, qcif, cif, 4cif, qqvga,
# qvga, vga, svga, xga, uxga, qxga, sxga, qsxga, hsxga, wvga, wxga, wsxga,
# wuxga, woxga, wqsxga, wquxga, whsxga, whuxga, cga, ega, hd480, hd720,
# hd1080
VideoSize hd1080

# Transmit only intra frames (useful for low bitrates, but kills frame rate).
#VideoIntraOnly

# If non-intra only, an intra frame is transmitted every VideoGopSize
# frames. Video synchronization can only begin at an intra frame.
VideoGopSize 12

# More MPEG-4 parameters
# VideoHighQuality
# Video4MotionVector

# Choose your codecs:
#AudioCodec mp2
#VideoCodec mpeg1video

# Suppress audio
#NoAudio

# Suppress video
#NoVideo

#VideoQMin 3
#VideoQMax 31

# Set this to the number of seconds backwards in time to start. Note that
# most players will buffer 5-10 seconds of video, and also you need to allow
# for a keyframe to appear in the data stream.
#Preroll 15

# ACL:

# You can allow ranges of addresses (or single addresses)
ACL ALLOW localhost

# You can deny ranges of addresses (or single addresses)
#ACL DENY <first address> <last address>

# You can repeat the ACL allow/deny as often as you like. It is on a per
# stream basis. The first match defines the action. If there are no matches,
# then the default is the inverse of the last ACL statement.
#
# Thus 'ACL allow localhost' only allows access from localhost.
# 'ACL deny 1.0.0.0 1.255.255.255' would deny the whole of network 1 and
# allow everybody else.

</Stream>


##################################################################
# Example streams


# Multipart JPEG

#<stream>
#Feed feed1.ffm
#Format mpjpeg
#VideoFrameRate 2
#VideoIntraOnly
#NoAudio
#Strict -1
#</stream>


# Single JPEG

#<stream>
#Feed feed1.ffm
#Format jpeg
#VideoFrameRate 2
#VideoIntraOnly
##VideoSize 352x240
#NoAudio
#Strict -1
#</stream>


# Flash

#<stream>
#Feed feed1.ffm
#Format swf
#VideoFrameRate 2
#VideoIntraOnly
#NoAudio
#</stream>


# ASF compatible

<Stream test.asf>
Feed feed1.ffm
Format asf
VideoFrameRate 15
VideoSize 352x240
VideoBitRate 256
VideoBufferSize 40
VideoGopSize 30
AudioBitRate 64
StartSendOnKey
</Stream>


# MP3 audio

#<stream>
#Feed feed1.ffm
#Format mp2
#AudioCodec mp3
#AudioBitRate 64
#AudioChannels 1
#AudioSampleRate 44100
#NoVideo
#</stream>


# Ogg Vorbis audio

#<stream>
#Feed feed1.ffm
#Title "Stream title"
#AudioBitRate 64
#AudioChannels 2
#AudioSampleRate 44100
#NoVideo
#</stream>


# Real with audio only at 32 kbits

#<stream>
#Feed feed1.ffm
#Format rm
#AudioBitRate 32
#NoVideo
#NoAudio
#</stream>


# Real with audio and video at 64 kbits

#<stream>
#Feed feed1.ffm
#Format rm
#AudioBitRate 32
#VideoBitRate 128
#VideoFrameRate 25
#VideoGopSize 25
#NoAudio
#</stream>


##################################################################
# A stream coming from a file: you only need to set the input
# filename and optionally a new format. Supported conversions:
# AVI -> ASF

#<stream>
#File "/usr/local/httpd/htdocs/tlive.rm"
#NoAudio
#</stream>

#<stream>
#File "/usr/local/httpd/htdocs/test.asf"
#NoAudio
#Author "Me"
#Copyright "Super MegaCorp"
#Title "Test stream from disk"
#Comment "Test comment"
#</stream>


##################################################################
# RTSP examples
#
# You can access this stream with the RTSP URL:
# rtsp://localhost:5454/test1-rtsp.mpg
#
# A non-standard RTSP redirector is also created. Its URL is:
# http://localhost:8090/test1-rtsp.rtsp

#<stream>
#Format rtp
#File "/usr/local/httpd/htdocs/test1.mpg"
#</stream>


# Transcode an incoming live feed to another live feed,
# using libx264 and video presets

#<stream>
#Format rtp
#Feed feed1.ffm
#VideoCodec libx264
#VideoFrameRate 24
#VideoBitRate 100
#VideoSize 480x272
#AVPresetVideo default
#AVPresetVideo baseline
#AVOptionVideo flags +global_header
#
#AudioCodec libfaac
#AudioBitRate 32
#AudioChannels 2
#AudioSampleRate 22050
#AVOptionAudio flags +global_header
#</stream>

##################################################################
# SDP/multicast examples
#
# If you want to send your stream in multicast, you must set the
# multicast address with MulticastAddress. The port and the TTL can
# also be set.
#
# An SDP file is automatically generated by ffserver by adding the
# 'sdp' extension to the stream name (here
# http://localhost:8090/test1-sdp.sdp). You should usually give this
# file to your player to play the stream.
#
# The 'NoLoop' option can be used to avoid looping when the stream is
# terminated.

#<stream>
#Format rtp
#File "/usr/local/httpd/htdocs/test1.mpg"
#MulticastAddress 224.124.0.1
#MulticastPort 5000
#MulticastTTL 16
#NoLoop
#</stream>


##################################################################
# Special streams

# Server status

<Stream stat.html>
Format status

# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255

#FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
</Stream>


# Redirect index.html to the appropriate site

<Redirect index.html>
URL http://www.ffmpeg.org/
</Redirect>


#http://www.ffmpeg.org/




Any help is greatly appreciated. I will do my best to draw a picture of the best answer, based on their username.