
Advanced search
Media (39)
-
Stereo master soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Audio
-
ED-ME-5 1-DVD
11 October 2011
Updated: October 2011
Language: English
Type: Audio
-
1,000,000
27 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Demon Seed
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
The Four of Us are Dying
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Corona Radiata
26 September 2011
Updated: September 2011
Language: English
Type: Audio
Other articles (108)
-
Updating from version 0.1 to 0.2
24 June 2013
An explanation of the notable changes made when moving from MediaSPIP version 0.1 to version 0.3. What's new?
Software dependencies: the latest FFMpeg releases (>= v1.2.1) are now used; the dependencies for Smush are installed; MediaInfo and FFprobe are installed to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favor of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
Customizing by adding your logo, banner or background image
5 September 2013
Some themes take three customization elements into account: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013
Present the changes on your MediaSPIP site, or the news of your projects, using the news section.
In MediaSPIP's default spipeo theme, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News item creation form: for a document of the news type, the fields offered by default are: publication date (customize the publication date) (...)
On other sites (15700)
-
FFMPEG: Explain parameters of any codec's function pointers
7 July 2014, by Zax

I'm going through the article How to integrate a codec in FFMPEG multimedia framework. According to it, every codec needs to define 3 basic functions, and these functions are assigned to function pointers of the AVCodec structure. The 3 function pointers specified in the article are:

.init -> takes care of allocations and other initializations
.close -> frees the allocated memory and performs de-initialization
.decode -> frame by frame decoding

For the .decode function pointer, the function assigned is:

static int cook_decode_frame(AVCodecContext *avctx,
                             void *data, int *data_size,
                             uint8_t *buf, int buf_size) {
    ...

The details of these parameters are specified in the above article. However, in the latest code, the same function is declared as shown below:

static int cook_decode_frame(AVCodecContext *avctx, void *data,
                             int *got_frame_ptr, AVPacket *avpkt)

I need to perform some mapping operations on the memory, so I would appreciate it if anyone could explain the parameters in the newer declaration. Which parameter holds the input buffer for decoding a frame? And after decoding a frame, to which parameter is the decoded frame mapped?
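For orientation, here is a minimal, purely illustrative sketch (the function name is made up; this is not the actual cook decoder) of how a decoder with the newer signature typically treats these parameters: the compressed input arrives in avpkt, the decoded frame is written through data, and *got_frame_ptr reports whether a frame was produced.

static int example_decode_frame(AVCodecContext *avctx, void *data,
                                int *got_frame_ptr, AVPacket *avpkt)
{
    const uint8_t *buf = avpkt->data;  /* input: the compressed bitstream */
    int buf_size       = avpkt->size;  /* input: its size in bytes */
    AVFrame *frame     = data;         /* output: the decoded frame is written here */

    *got_frame_ptr = 0;                /* nothing decoded yet */

    /* ... parse buf and fill frame->data[] / frame->linesize[] ... */

    *got_frame_ptr = 1;                /* a complete frame was produced */
    return buf_size;                   /* number of input bytes consumed */
}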
-
Feeding MediaCodec with byte data from AVPacket: problems with output buffers
2 March 2016, by serg66

Description of my task:
I'm developing a video player on Android (API >= 17). It has to work with both HLS and multicast video. In addition, it has to support multiple audio tracks.

Why I decided to use ffmpeg:
- On some devices MediaPlayer doesn't support multicast video
- MediaExtractor doesn't work with HLS (getTrackCount() returns 0)
- ffmpeg works with both HLS and multicast

My idea:
I demux a stream using ffmpeg in a loop. I get the CSD from videoStream->codec->extradata and configure the MediaFormat accordingly. On each iteration, when a new video AVPacket is available, I run its buffer through the h264_mp4toannexb bitstream filter (set up with av_bitstream_filter_init). Then I call the Java method onNewVideoData, in which I get the AVPacket byte array. I clear the available input buffer, then fill it with the new data. I also get the pts. Since the stream has no defined beginning, I additionally compute new pts values by subtracting the pts of the first AVPacket from all the following ones; the first pts I set to 0. Then I call queueInputBuffer to send the buffer to the decoder.

I use two threads: one for getting and submitting data to the input buffers, and another one for posting it to the Surface.

The full player C code:
#include <jni.h>
#include <android/log.h>

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/buffer.h>
#define TAG "ffmpegPlayer"
struct
{
const char* url;
jint width;
jint height;
jfloat aspectRatio;
jint streamsCount;
AVFormatContext* formatContext;
AVStream* videoStream;
} context;
AVPacket packet;
AVBitStreamFilterContext* avBitStreamFilterContext;
JNIEXPORT jbyteArray JNICALL Java_com_example_app_FfmpegPlayer_getCsdNative(JNIEnv* env, jobject x)
{
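/* copy the codec extradata (the CSD; for H.264 this typically holds the SPS/PPS) into a Java byte array */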
jbyteArray arr = (*env)->NewByteArray(env, context.videoStream->codec->extradata_size);
(*env)->SetByteArrayRegion(env, arr, 0, context.videoStream->codec->extradata_size, (jbyte*)context.videoStream->codec->extradata);
return arr;
}
JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getWidthNative(JNIEnv* env, jobject x)
{
return context.width;
}
JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getHeightNative(JNIEnv* env, jobject x)
{
return context.height;
}
JNIEXPORT jfloat JNICALL Java_com_example_app_FfmpegPlayer_getAspectRatioNative(JNIEnv* env, jobject x)
{
return context.aspectRatio;
}
JNIEXPORT jfloat JNICALL Java_com_example_app_FfmpegPlayer_getStreamsCountNative(JNIEnv* env, jobject x)
{
return context.streamsCount;
}
JNIEXPORT jlong JNICALL Java_com_example_app_FfmpegPlayer_getPtsNative(JNIEnv* env, jobject obj)
{
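/* convert pts from stream time_base units into microseconds */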
return packet.pts * av_q2d(context.videoStream->time_base) * 1000000;
}
JNIEXPORT jboolean JNICALL Java_com_example_app_FfmpegPlayer_initNative(JNIEnv* env, jobject obj, const jstring u)
{
av_register_all();
avBitStreamFilterContext = av_bitstream_filter_init("h264_mp4toannexb");
const char* url = (*env)->GetStringUTFChars(env, u , NULL);
__android_log_print(ANDROID_LOG_DEBUG, TAG, "Init: %s", url);
AVFormatContext* formatContext = NULL;
if (avformat_open_input(&formatContext, url, NULL, NULL) < 0) {
__android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to open input");
return JNI_FALSE;
}
if (avformat_find_stream_info(formatContext, NULL) < 0) {
__android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to find stream info");
return JNI_FALSE;
}
AVInputFormat * iformat = formatContext->iformat;
__android_log_print(ANDROID_LOG_DEBUG, TAG, "format: %s", iformat->name);
context.streamsCount = formatContext->nb_streams;
__android_log_print(ANDROID_LOG_DEBUG, TAG, "Streams count: %d", formatContext->nb_streams);
int i = 0;
AVStream* videoStream = NULL;
AVDictionaryEntry* lang;
for (i = 0; i < formatContext->nb_streams; i++) {
int codecType = formatContext->streams[i]->codec->codec_type;
if (videoStream == NULL && codecType == AVMEDIA_TYPE_VIDEO) {
videoStream = formatContext->streams[i];
}
else if (codecType == AVMEDIA_TYPE_AUDIO) {
lang = av_dict_get(formatContext->streams[i]->metadata, "language", NULL, 0);
if (lang != NULL) {
__android_log_print(ANDROID_LOG_DEBUG, TAG, "Audio stream %d: %s", i, lang->value);
}
}
}
if (videoStream == NULL) {
__android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to find video stream");
return JNI_FALSE;
}
context.videoStream = videoStream;
__android_log_print(ANDROID_LOG_DEBUG, TAG, "Video stream: %d", videoStream->index);
AVCodecContext *codecContext = formatContext->streams[videoStream->index]->codec;
__android_log_print(ANDROID_LOG_DEBUG, TAG, "width: %d, height: %d", codecContext->width, codecContext->height);
context.width = codecContext->width;
context.height = codecContext->height;
AVRational aspectRatio = codecContext->sample_aspect_ratio;
__android_log_print(ANDROID_LOG_DEBUG, TAG, "aspect ratio: %d/%d", aspectRatio.num, aspectRatio.den);
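/* note: num and den are ints, so this division truncates to a whole number */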
context.aspectRatio = aspectRatio.num / aspectRatio.den;
context.formatContext = formatContext;
return JNI_TRUE;
}
void filterPacket()
{
av_bitstream_filter_filter(avBitStreamFilterContext, context.videoStream->codec, NULL, &packet.data, &packet.size, packet.data, packet.size, packet.flags);
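/* note: when the filter allocates a new output buffer, packet.data is replaced
   with it (the old pointer is not freed here), while packet.buf keeps owning
   the original bytes */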
}
JNIEXPORT void JNICALL Java_com_example_app_FfmpegPlayer_startNative(JNIEnv* env, jobject obj)
{
jclass cl = (*env)->GetObjectClass(env, obj);
jmethodID updateMethodId = (*env)->GetMethodID(env, cl, "onNewVideoData", "()V");
while (av_read_frame(context.formatContext, &packet) >= 0) {
if (context.formatContext == NULL) {
return;
}
if (packet.stream_index == context.videoStream->index) {
filterPacket();
(*env)->CallVoidMethod(env, obj, updateMethodId);
}
}
}
JNIEXPORT jbyteArray JNICALL Java_com_example_app_FfmpegPlayer_getVideoDataNative(JNIEnv* env, jobject obj)
{
AVBufferRef *buf = packet.buf;
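/* note: packet.buf is the packet's original reference-counted buffer; if the
   bitstream filter allocated a new output buffer, the filtered bytes are in
   packet.data instead */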
jbyteArray arr = (*env)->NewByteArray(env, buf->size);
(*env)->SetByteArrayRegion(env, arr, 0, buf->size, (jbyte*)buf->data);
return arr;
}

The full Java code:
package com.example.app;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;
public class FfmpegPlayer {
static {
System.loadLibrary("avutil-54");
System.loadLibrary("swscale-3");
System.loadLibrary("swresample-1");
System.loadLibrary("avcodec-56");
System.loadLibrary("avformat-56");
System.loadLibrary("avfilter-5");
System.loadLibrary("ffmpeg-player");
}
private native boolean initNative(String url);
private native boolean startNative();
private native int getWidthNative();
private native int getHeightNative();
private native float getAspectRatioNative();
private native byte[] getVideoDataNative();
private native long getPtsNative();
private native byte[] getCsdNative();
private String source;
private PlayerThread playerThread;
private int width;
private int height;
private MediaCodec decoder;
private ByteBuffer[] inputBuffers;
private Surface surface;
private long firstPtsTime;
public FfmpegPlayer(Surface surface) {
this.surface = surface;
}
public void setDataSource(String source) {
if (!initNative(source)) {
return;
}
width = getWidthNative();
height = getHeightNative();
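// for video/avc, MediaCodec expects the H.264 codec-specific data (SPS and PPS) to be supplied via the csd buffers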
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width * height);
format.setByteBuffer("csd-0", ByteBuffer.wrap(getCsdNative()));
LogUtils.log("CSD: ");
outputAsHex(getCsdNative());
try {
decoder = MediaCodec.createDecoderByType("video/avc");
decoder.configure(format, surface, null, 0);
decoder.start();
playerThread = new PlayerThread();
playerThread.start();
new OutputThread().run();
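// note: calling run() executes the OutputThread loop on the current thread; start() would spawn a separate thread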
}
catch (Exception e) {
e.printStackTrace();
}
}
public void onNewVideoData() {
int index = decoder.dequeueInputBuffer(0);
if (index >= 0) {
byte[] data = getVideoDataNative();
ByteBuffer byteBuffer = decoder.getInputBuffers()[index];
byteBuffer.clear();
byteBuffer.put(data);
long pts = getPtsNative();
LogUtils.log("Input AVPacket pts: " + pts);
LogUtils.log("Input AVPacket data length: " + data.length);
LogUtils.log("Input AVPacket data: ");
outputAsHex(data);
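// rebase timestamps so that the first packet's pts becomes 0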
if (firstPtsTime == 0) {
firstPtsTime = pts;
pts = 0;
}
else {
pts -= firstPtsTime;
}
decoder.queueInputBuffer(index, 0, data.length, pts, 0);
}
}
private void outputAsHex(byte[] data) {
String[] test = new String[data.length];
for (int i = 0; i < data.length; i++) {
test[i] = String.format("%02x", data[i]);
}
LogUtils.log(test);
}
private class PlayerThread extends Thread {
@Override
public void run() {
super.run();
startNative();
}
}
private class OutputThread extends Thread {
@Override
public void run() {
super.run();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
while (true) {
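// poll with zero timeout; negative indexes such as INFO_TRY_AGAIN_LATER or INFO_OUTPUT_FORMAT_CHANGED are ignored here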
int index = decoder.dequeueOutputBuffer(info, 0);
if (index >= 0) {
ByteBuffer buffer = decoder.getOutputBuffers()[index];
buffer.position(info.offset);
buffer.limit(info.offset + info.size);
byte[] test = new byte[info.size];
for (int i = 0; i < info.size; i++) {
test[i] = buffer.get(i);
}
LogUtils.log("Output info: size=" + info.size + ", presentationTimeUs=" + info.presentationTimeUs + ",offset=" + info.offset + ",flags=" + info.flags);
LogUtils.log("Output data: ");
outputAsHex(test);
decoder.releaseOutputBuffer(index, true);
}
}
}
}
}

The problem:
For the tests I used a TS file with the following video stream:

Codec: H264 - MPEG-4 AVC (part 10) (h264)
Resolution: 720x578
Frame rate: 25
Decoded format: Planar 4:2:0 YUV

The CSD is the following:
[00, 00, 00, 01, 09, 10, 00, 00, 00, 01, 27, 4d, 40, 1e, 9a, 62, 01, 68, 48, b0, 44, 20, a0, a0, a8, 00, 00, 03, 00, 08, 00, 00, 03, 01, 94, a0, 00, 00, 00, 01, 28, ee, 3c, 80]
On different devices I get different results, but I never managed to get the video shown on the Surface.

Input:
Input AVPacket pts: 351519222
Input AVPacket data length: 54941
Input AVPacket data: [00, 00, 00, 01, 09, 10, 00, 00, 00, 01, 27, 4d, 40, 1e, 9a, 62, 01, 68, 48, b0, 44, 20, a0, a0, a8, 00, 00, 03, 00, 08, 00, 00, 03, 01, 94, a0, 00, 00, 00, 01,...]
------------------------------------
Input AVPacket pts: 351539222
Input AVPacket data length: 9605
Input AVPacket data: [00, 00, 00, 01, 09, 30, 00, 00, 00, 01, 06, 01, 01, 24, 80, 00, 00, 00, 01, 21, e3, bd, da, e4, 46, c5, 8b, 6b, 7d, 07, 59, 23, 6f, 92, e9, fb, 3b, b9, 4d, f9,...]
------------------------------------
Input AVPacket pts: 351439222
Input AVPacket data length: 1985
Input AVPacket data: [00, 00, 00, 01, 09, 50, 00, 00, 00, 01, 06, 01, 01, 14, 80, 00, 00, 00, 01, 21, a8, f2, 74, 69, 14, 54, 4d, c5, 8b, e8, 42, 52, ac, 80, 53, b4, 4d, 24, 1f, 6c,...]
------------------------------------
Input AVPacket pts: 351459222
Input AVPacket data length: 2121
Input AVPacket data: [00, 00, 00, 01, 09, 50, 00, 00, 00, 01, 06, 01, 01, 24, 80, 00, 00, 00, 01, 21, a8, f3, 74, e9, 0b, 8b, 17, e8, 43, f8, 10, 88, ca, 2b, 11, 53, c8, 31, f0, 0b,...]
... and so on

Asus Zenfone (Android 5.0.2) output thread (after decoding, strange results: 25 buffers of only 8 bytes of data each):
Output info: size=8, presentationTimeUs=-80001,offset=0,flags=0
Output data:
[01, 00, 00, 00, 90, c5, 99, ac]
---------------------------
Output info: size=8, presentationTimeUs=0,offset=0,flags=1
Output data:
[01, 00, 00, 00, 78, ea, 86, ac]
---------------------------
Output info: size=8, presentationTimeUs=720000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e8, 86, b6, ac]
---------------------------
Output info: size=8, presentationTimeUs=780000,offset=0,flags=1
Output data:
[01, 00, 00, 00, c0, cb, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=840000,offset=0,flags=0
Output data:
[01, 00, 00, 00, 80, 87, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=960000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, 3f, 8b, ac]
---------------------------
Output info: size=8, presentationTimeUs=1040000,offset=0,flags=1
Output data:
[01, 00, 00, 00, f8, 76, 85, ac]
---------------------------
Output info: size=8, presentationTimeUs=1180000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, 87, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=1260000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e8, b5, d2, ac]
---------------------------
Output info: size=8, presentationTimeUs=1800000,offset=0,flags=0
Output data:
[01, 00, 00, 00, 90, c5, 99, ac]
---------------------------
Output info: size=8, presentationTimeUs=1860000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, c0, 84, ac]
---------------------------
Output info: size=8, presentationTimeUs=2080000,offset=0,flags=1
Output data:
[01, 00, 00, 00, c0, cb, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=3440000,offset=0,flags=1
Output data:
[01, 00, 00, 00, 80, 87, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=3520000,offset=0,flags=0
Output data:
[01, 00, 00, 00, 78, ea, 86, ac]
---------------------------
Output info: size=8, presentationTimeUs=4160000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e8, 86, b6, ac]
---------------------------
Output info: size=8, presentationTimeUs=4300000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, 3f, 8b, ac]
---------------------------
Output info: size=8, presentationTimeUs=4400000,offset=0,flags=1
Output data:
[01, 00, 00, 00, 90, c5, 99, ac]
---------------------------
Output info: size=8, presentationTimeUs=4480000,offset=0,flags=1
Output data:
[01, 00, 00, 00, f8, 76, 85, ac]
---------------------------
Output info: size=8, presentationTimeUs=4680000,offset=0,flags=0
Output data:
[01, 00, 00, 00, c0, cb, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=4720000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, c0, 84, ac]
---------------------------
Output info: size=8, presentationTimeUs=4760000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e0, 87, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=4800000,offset=0,flags=0
Output data:
[01, 00, 00, 00, 58, 54, 83, ac]
---------------------------
Output info: size=8, presentationTimeUs=5040000,offset=0,flags=0
Output data:
[01, 00, 00, 00, e8, b5, d2, ac]
---------------------------
Output info: size=8, presentationTimeUs=5100000,offset=0,flags=1
Output data:
[01, 00, 00, 00, 80, 87, 93, ac]
---------------------------
Output info: size=8, presentationTimeUs=5320000,offset=0,flags=0
Output data:
[01, 00, 00, 00, 78, ea, 86, ac]
---------------------------
Output info: size=8, presentationTimeUs=5380000,offset=0,flags=1
Output data:
[01, 00, 00, 00, e8, 86, b6, ac]

Other Asus Zenfone logs:
01-25 17:11:36.859 4851-4934/com.example.app I/OMXClient: Using client-side OMX mux.
01-25 17:11:36.865 317-1075/? I/OMX-VDEC-1080P: component_init: OMX.qcom.video.decoder.avc : fd=43
01-25 17:11:36.867 317-1075/? I/OMX-VDEC-1080P: Capabilities: driver_name = msm_vidc_driver, card = msm_vdec_8974, bus_info = , version = 1, capabilities = 4003000
01-25 17:11:36.881 317-1075/? I/OMX-VDEC-1080P: omx_vdec::component_init() success : fd=43
01-25 17:11:36.885 4851-4934/com.example.app I/ACodec: [OMX.qcom.video.decoder.avc] DRC Mode: Dynamic Buffer Mode
01-25 17:11:36.893 317-20612/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.933 317-12269/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.933 317-12269/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.935 317-5559/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.957 317-5559/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.957 4851-4934/com.example.app I/ExtendedCodec: Decoder will be in frame by frame mode
01-25 17:11:36.963 317-1075/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.963 317-1075/? E/C2DColorConvert: unknown format passed for luma alignment number
01-25 17:11:36.964 317-20612/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.describeColorFormat not implemented
01-25 17:11:37.072 317-20612/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.describeColorFormat not implemented
01-25 17:11:37.072 4851-4934/com.example.app W/ACodec: do not know color format 0x7fa30c04 = 2141391876

Asus Nexus 7 (Android 6.0.1) crashes:
01-25 17:23:06.921 11602-11695/com.example.app I/OMXClient: Using client-side OMX mux.
01-25 17:23:06.952 11602-11694/com.example.app I/MediaCodec: [OMX.qcom.video.decoder.avc] setting surface generation to 11880449
01-25 17:23:06.954 194-194/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.storeANWBufferInMetadata not implemented
01-25 17:23:06.954 194-194/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.storeMetaDataInBuffers not implemented
01-25 17:23:06.954 194-194/? E/OMXNodeInstance: getExtensionIndex(45:qcom.decoder.avc, OMX.google.android.index.storeMetaDataInBuffers) ERROR: NotImplemented(0x80001006)
01-25 17:23:06.954 11602-11695/com.example.app E/ACodec: [OMX.qcom.video.decoder.avc] storeMetaDataInBuffers failed w/ err -2147483648
01-25 17:23:06.963 11602-11695/com.example.app D/SurfaceUtils: set up nativeWindow 0xa0b7a108 for 720x576, color 0x7fa30c03, rotation 0, usage 0x42002900
01-25 17:23:06.967 194-604/? E/OMX-VDEC-1080P: GET_MV_BUFFER_SIZE returned: Size: 122880 and alignment: 8192
01-25 17:23:07.203 11602-11695/com.example.app W/AHierarchicalStateMachine: Warning message AMessage(what = 'omxI') = {
int32_t type = 0
int32_t event = 2130706432
int32_t data1 = 1
int32_t data2 = 0
} unhandled in root state.
01-25 17:23:07.232 11602-11695/com.example.app D/SurfaceUtils: set up nativeWindow 0xa0b7a108 for 720x576, color 0x7fa30c03, rotation 0, usage 0x42002900
01-25 17:23:07.241 194-194/? E/OMX-VDEC-1080P: GET_MV_BUFFER_SIZE returned: Size: 122880 and alignment: 8192
01-25 17:23:07.242 194-194/? E/OMX-VDEC-1080P: Insufficient sized buffer given for playback, expected 671744, got 663552
01-25 17:23:07.242 194-194/? E/OMXNodeInstance: useBuffer(45:qcom.decoder.avc, Output:1 671744@0xb60a0860) ERROR: BadParameter(0x80001005)
01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: registering GraphicBuffer 0 with OMX IL component failed: -2147483648
01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: Failed to allocate output port buffers after port reconfiguration: (-2147483648)
01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: signalError(omxError 0x80001001, internalError -2147483648)
01-25 17:23:07.243 11602-11694/com.example.app E/MediaCodec: Codec reported err 0x80001001, actionCode 0, while in state 6
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: java.lang.IllegalStateException
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at android.media.MediaCodec.native_dequeueOutputBuffer(Native Method)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at android.media.MediaCodec.dequeueOutputBuffer(MediaCodec.java:2379)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at com.example.app.FfmpegPlayer$OutputThread.run(FfmpegPlayer.java:122)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at com.example.app.FfmpegPlayer.setDataSource(FfmpegPlayer.java:66)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at com.example.app.activities.TestActivity$2.surfaceCreated(TestActivity.java:151)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at android.view.SurfaceView.updateWindow(SurfaceView.java:583)
01-25 17:23:07.245 11602-11602/com.example.app W/System.err: at android.view.SurfaceView$3.onPreDraw(SurfaceView.java:177)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.ViewTreeObserver.dispatchOnPreDraw(ViewTreeObserver.java:944)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:2055)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1107)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:6013)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.Choreographer$CallbackRecord.run(Choreographer.java:858)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.Choreographer.doCallbacks(Choreographer.java:670)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.Choreographer.doFrame(Choreographer.java:606)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:844)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.os.Handler.handleCallback(Handler.java:739)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.os.Handler.dispatchMessage(Handler.java:95)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.os.Looper.loop(Looper.java:148)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at android.app.ActivityThread.main(ActivityThread.java:5417)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at java.lang.reflect.Method.invoke(Native Method)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
01-25 17:23:07.246 11602-11602/com.example.app W/System.err: at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)

Another device always has empty output buffers, though the indexes are >= 0.

What am I doing wrong?
-
ffmpeg doesn't send data through udp
5 April 2018, by ofir dubi

I'm using ffmpeg to stream my desktop (the server) to another computer (the client).

Server command:
ffmpeg -f gdigrab -i desktop -f mpegts udp:1.2.3.4:1234

Client command:
ffplay -f mpegts udp://4.3.2.1:1234

When I run the server and the client on the same computer (using IP 127.0.0.1) it works and I can see my desktop, but when I try to send it to another computer it fails. I checked my network traffic, and apparently the server isn't sending any data. So why is the data not being sent?
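For reference, a sketch of the same commands using ffmpeg's documented UDP URL form with an explicit udp:// scheme (the addresses are placeholders, not taken from a working setup):

ffmpeg -f gdigrab -i desktop -f mpegts udp://<receiver-ip>:1234
ffplay -f mpegts udp://<listen-ip>:1234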