
Media (1)
-
SWFUpload Process
6 September 2011
Updated: September 2011
Language: French
Type: Text
Other articles (65)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out. -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound both on conventional computers (...) -
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (8795)
-
Unplayable video after running FFmpeg command
25 May 2020, by HB.
I asked this question last year. I resolved the issue I had and implemented the same logic to merge an image with a video instead of two images. This is running on Android.



Here is the command I'm currently using:



"-i", mFilePath, "-i", drawingPath, "-filter_complex", "[0:v]scale=iw*sar:ih,setsar=1,pad='max(iw\\,2*trunc(ih*47/80/2))':'max(ih\\,2*trunc(ow*80/47/2))':(ow-iw)/2:(oh-ih)/2[v0];[1:v][v0]scale2ref[v1][v0];[v0][v1]overlay=x=(W-w)/2:y=(H-h)/2[v]", "-map", "[v]", "-map", "0:a", "-c:v", "libx264", "-preset", "ultrafast", "-r", outputFPS, outputPath}




The 47/80/2 in the pad expression is calculated from the device's screen dimensions, 1128 x 1920.
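As a quick sanity check on where that ratio comes from (assuming the 1128 x 1920 portrait screen reported above): both dimensions share a factor of 24, so the width:height ratio reduces to 47:80. A throwaway shell one-liner with the numbers hard-coded illustrates this:

echo "$((1128 / 24)):$((1920 / 24))"   # prints 47:80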


When running this on certain devices, it results in an unplayable video.



But running the following command works perfectly fine:



"-i", mFilePath, "-crf", "18", "-c:v", "libx264", "-preset", "ultrafast", outputPath};




Could the issue be with the filter being applied?





I compared running the first command on two different devices.




- On the first device (Samsung J7 Pro), I was able to run the command successfully and play the video afterward. I tested the output on both devices and it is working.
- On the second device (Sony Xperia Tablet Z), I was able to run the command successfully but could not play the video. I tested the output on both devices and it doesn't play on either. It does play on my computer.







I compared the original video, the non-working output, and the output made without a filter. The only difference I could find is the H.264 profile and level: the non-working file is Baseline@L4.2, the file made without a filter is Baseline@L4.0, and the original video is High@L4.0.
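For reference, one way to read the profile and level fields compared above is a plain ffprobe query (standard ffprobe options; the file name is a placeholder for whichever output is being checked):

ffprobe -v error -select_streams v:0 -show_entries stream=profile,level -of default=nw=1 output_with_filter.mp4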


Here are all the videos: the original, the one without a filter (working), and the one with the filter (not working).



I have no idea why this is happening. Any help would be appreciated.





Edit 1:



Here is the actual command as requested:



"-i", "/storage/emulated/0/Android/data/com.my.package/files/CameraTemp/2020_05_24_09_17_53.mp4", "-i", "/storage/emulated/0/Android/data/com.my.package/files/MyVideos/tempShapes.png", "-filter_complex", "[0:v]scale=iw*sar:ih,setsar=1,pad='max(iw\\,2*trunc(ih*47/80/2))':'max(ih\\,2*trunc(ow*80/47/2))':(ow-iw)/2:(oh-ih)/2[v0];[1:v][v0]scale2ref[v1][v0];[v0][v1]overlay=x=(W-w)/2:y=(H-h)/2[v]", "-map", "[v]", "-map", "0:a", "-c:v", "libx264", "-preset", "ultrafast", "-r", "30", "/storage/emulated/0/Android/data/com.my.package/files/MyVideos/video with line.mp4"




and here is the complete log:



ffmpeg version n4.0-39-gda39990 Copyright (c) 2000-2018 the FFmpeg developers
 built with gcc 4.9.x (GCC) 20150123 (prerelease)
 configuration: --target-os=linux --cross-prefix=/root/bravobit/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/root/bravobit/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-ffprobe --enable-libopus --enable-libvorbis --enable-libfdk-aac --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-libvpx --enable-libass --enable-yasm --enable-pthreads --disable-debug --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-linux-perf --disable-doc --disable-shared --enable-static --enable-runtime-cpudetect --enable-nonfree --enable-network --enable-avresample --enable-avformat --enable-avcodec --enable-indev=lavfi --enable-hwaccels --enable-ffmpeg --enable-zlib --enable-gpl --enable-small --enable-nonfree --pkg-config=pkg-config --pkg-config-flags=--static --prefix=/root/bravobit/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/root/bravobit/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/root/bravobit/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-cxxflags=
 libavutil 56. 14.100 / 56. 14.100
 libavcodec 58. 18.100 / 58. 18.100
 libavformat 58. 12.100 / 58. 12.100
 libavdevice 58. 3.100 / 58. 3.100
 libavfilter 7. 16.100 / 7. 16.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 1.100 / 5. 1.100
 libswresample 3. 1.100 / 3. 1.100
 libpostproc 55. 1.100 / 55. 1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Android/data/com.my.package/files/CameraTemp/2020_05_24_09_17_53.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: isommp42
 creation_time : 2020-05-24T08:18:02.000000Z
 Duration: 00:00:01.64, start: 0.000000, bitrate: 20750 kb/s
 Stream #0:0(eng): Video: h264 (avc1 / 0x31637661), yuv420p, 1920x1080, 18056 kb/s, SAR 1:1 DAR 16:9, 29.70 fps, 29.67 tbr, 90k tbn, 180k tbc (default)
 Metadata:
 creation_time : 2020-05-24T08:18:02.000000Z
 handler_name : VideoHandle
 Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 155 kb/s (default)
 Metadata:
 creation_time : 2020-05-24T08:18:02.000000Z
 handler_name : SoundHandle
Input #1, png_pipe, from '/storage/emulated/0/Android/data/com.my.package/files/MyVideos/tempShapes.png':
 Duration: N/A, bitrate: N/A
 Stream #1:0: Video: png, rgba(pc), 1920x1128, 25 tbr, 25 tbn, 25 tbc
Stream mapping:
 Stream #0:0 (h264) -> scale (graph 0)
 Stream #1:0 (png) -> scale2ref:default (graph 0)
 overlay (graph 0) -> Stream #0:0 (libx264)
 Stream #0:1 -> #0:1 (aac (native) -> aac (native))
Press [q] to stop, [?] for help
frame= 0 fps=0.0 q=0.0 size= 0kB time=-577014:32:22.77 bitrate= -0.0kbits/s speed=N/A 
frame= 0 fps=0.0 q=0.0 size= 0kB time=-577014:32:22.77 bitrate= -0.0kbits/s speed=N/A 
[libx264 @ 0xb83fc8a0] using SAR=1/1
[libx264 @ 0xb83fc8a0] using cpu capabilities: ARMv6 NEON
[libx264 @ 0xb83fc8a0] profile Constrained Baseline, level 4.2
[libx264 @ 0xb83fc8a0] 264 - core 152 r2851M ba24899 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=2 keyint_min=1 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
Output #0, mp4, to '/storage/emulated/0/Android/data/com.my.package/files/MyVideos/video with line.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: isommp42
 encoder : Lavf58.12.100
 Stream #0:0: Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 1920x1128 [SAR 1:1 DAR 80:47], q=-1--1, 29 fps, 14848 tbn, 29 tbc (default)
 Metadata:
 encoder : Lavc58.18.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
 Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 128 kb/s (default)
 Metadata:
 creation_time : 2020-05-24T08:18:02.000000Z
 handler_name : SoundHandle
 encoder : Lavc58.18.100 aac
frame= 1 fps=0.4 q=0.0 size= 0kB time=00:00:01.01 bitrate= 0.4kbits/s speed=0.397x 
frame= 5 fps=1.6 q=0.0 size= 0kB time=00:00:01.01 bitrate= 0.4kbits/s speed=0.33x 
frame= 9 fps=2.5 q=24.0 size= 256kB time=00:00:01.01 bitrate=2075.0kbits/s speed=0.28x 
frame= 13 fps=3.1 q=25.0 size= 1024kB time=00:00:01.01 bitrate=8298.9kbits/s speed=0.243x 
frame= 18 fps=3.8 q=29.0 size= 2048kB time=00:00:01.01 bitrate=16597.5kbits/s speed=0.214x 
frame= 21 fps=3.9 q=25.0 size= 2560kB time=00:00:01.01 bitrate=20746.7kbits/s speed=0.19x 
frame= 23 fps=3.9 q=25.0 size= 2816kB time=00:00:01.01 bitrate=22821.4kbits/s speed=0.173x 
frame= 26 fps=4.0 q=29.0 size= 3584kB time=00:00:01.01 bitrate=29045.3kbits/s speed=0.156x 
Past duration 0.617577 too large
Past duration 0.639641 too large
frame= 28 fps=3.9 q=29.0 size= 3840kB time=00:00:01.01 bitrate=31119.9kbits/s speed=0.142x 
Past duration 0.665230 too large
frame= 29 fps=3.8 q=25.0 size= 3840kB time=00:00:01.01 bitrate=31119.9kbits/s speed=0.132x 
Past duration 0.690834 too large
Past duration 0.711281 too large
Past duration 0.736885 too large
frame= 32 fps=3.9 q=29.0 size= 4608kB time=00:00:01.01 bitrate=37343.8kbits/s speed=0.123x 
Past duration 0.762474 too large
Past duration 0.783577 too large
Past duration 0.807564 too large
frame= 35 fps=3.9 q=25.0 size= 4864kB time=00:00:01.01 bitrate=39418.4kbits/s speed=0.112x 
Past duration 0.831551 too large
Past duration 0.855537 too large
frame= 37 fps=3.5 q=25.0 size= 5376kB time=00:00:01.01 bitrate=43567.7kbits/s speed=0.0968x 
Past duration 0.879524 too large
Past duration 0.903511 too large
frame= 39 fps=3.4 q=25.0 size= 5376kB time=00:00:01.06 bitrate=41196.6kbits/s speed=0.0927x 
Past duration 0.927498 too large
Past duration 0.951500 too large
frame= 41 fps=3.4 q=25.0 size= 5376kB time=00:00:01.13 bitrate=38700.0kbits/s speed=0.0931x 
frame= 41 fps=3.2 q=25.0 size= 5376kB time=00:00:01.13 bitrate=38700.0kbits/s speed=0.0886x 
frame= 41 fps=3.1 q=25.0 size= 5888kB time=00:00:01.43 bitrate=33554.2kbits/s speed=0.108x 
Past duration 0.975487 too large
frame= 45 fps=3.2 q=26.0 size= 6656kB time=00:00:01.60 bitrate=33905.4kbits/s speed=0.114x 
frame= 45 fps=3.0 q=-1.0 Lsize= 8158kB time=00:00:01.65 bitrate=40480.7kbits/s speed=0.11x 
video:8127kB audio:28kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.032895%
[libx264 @ 0xb83fc8a0] frame I:23 Avg QP:24.70 size:337646
[libx264 @ 0xb83fc8a0] frame P:22 Avg QP:29.00 size: 25250
[libx264 @ 0xb83fc8a0] mb I I16..4: 100.0% 0.0% 0.0%
[libx264 @ 0xb83fc8a0] mb P I16..4: 0.4% 0.0% 0.0% P16..4: 43.6% 0.0% 0.0% 0.0% 0.0% skip:56.0%
[libx264 @ 0xb83fc8a0] coded y,uvDC,uvAC intra: 90.0% 84.7% 58.1% inter: 20.1% 6.2% 0.1%
[libx264 @ 0xb83fc8a0] i16 v,h,dc,p: 25% 28% 28% 20%
[libx264 @ 0xb83fc8a0] i8c dc,h,v,p: 39% 25% 20% 16%
[libx264 @ 0xb83fc8a0] kb/s:42901.20
[aac @ 0xb83d7d10] Qavg: 3517.779



-
FFmpeg on android is crashing in avcodec_decode_video2 function
6 June 2015, by Matt Wolfe
FFmpeg is crashing in libavcodec/utils.c, in avcodec_decode_video2, around line 2400:
ret = avctx->codec->decode(avctx, picture, got_picture_ptr, &tmp);
So I've compiled ffmpeg for Android using the following configure script (based on the one from here):
prefix=${src_root}/ffmpeg/android/arm
addi_cflags="-marm -Os -fpic"
addi_ldflags=""
./configure \
--prefix=${prefix} \
--target-os=linux \
--arch=arm \
--enable-shared \
--disable-doc \
--disable-programs \
--disable-symver \
--cross-prefix=${TOOLCHAIN}/bin/arm-linux-androideabi- \
--enable-cross-compile \
--enable-decoder=aac \
--enable-decoder=mpeg4 \
--enable-decoder=h263 \
--enable-decoder=flv \
--enable-decoder=mpegvideo \
--enable-decoder=mpeg2video \
--sysroot=${SYSROOT} \
--extra-cflags="${addi_cflags}" \
--pkg-config=$(which pkg-config) >> ${build_log} 2>&1 || die "Couldn't configure ffmpeg"

The *.so files get copied over into my project, which I reference from my Android.mk script:
LOCAL_PATH := $(call my-dir)
FFMPEG_PATH=/path/to/android-ffmpeg-with-rtmp/build/dist
include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
LOCAL_SRC_FILES :=$(FFMPEG_PATH)/lib/libavcodec-56.so
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libavdevice
LOCAL_SRC_FILES :=$(FFMPEG_PATH)/lib/libavdevice-56.so
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libavfilter
LOCAL_SRC_FILES :=$(FFMPEG_PATH)/lib/libavfilter-5.so
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libavformat
LOCAL_SRC_FILES :=$(FFMPEG_PATH)/lib/libavformat-56.so
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libavutil
LOCAL_SRC_FILES :=$(FFMPEG_PATH)/lib/libavutil-54.so
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libswresample
LOCAL_SRC_FILES :=$(FFMPEG_PATH)/lib/libswresample-1.so
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libswscale
LOCAL_SRC_FILES :=$(FFMPEG_PATH)/lib/libswscale-3.so
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_LDLIBS := -llog
LOCAL_C_INCLUDES := $(FFMPEG_PATH)/include
#LOCAL_PRELINK_MODULE := false
LOCAL_MODULE := axonffmpeg
LOCAL_SRC_FILES := libffmpeg.c
LOCAL_CFLAGS := -g
LOCAL_SHARED_LIBRARIES := libavcodec libavdevice libavfilter libavformat libavutil libswresample libswscale
include $(BUILD_SHARED_LIBRARY)

I'm building a little wrapper to decode frames (MPEG-4 Part 2 Simple Profile video) that come from an external camera:
#include <jni.h>
#include <stdarg.h>
#include <android/log.h>
#include <libavutil/opt.h>
#include <libavcodec/avcodec.h>
#include <libavutil/channel_layout.h>
#include <libavutil/common.h>
#include <libavutil/imgutils.h>
#include <libavutil/mathematics.h>
#include <libavutil/samplefmt.h>
#define DEBUG_TAG "LibFFMpeg:NDK"
AVCodec *codec;
AVFrame *current_frame;
AVCodecContext *context;
int resWidth, resHeight, bitRate;
void my_log_callback(void *ptr, int level, const char *fmt, va_list vargs);
jint Java_com_mycompany_axonv2_LibFFMpeg_initDecoder(JNIEnv * env, jobject this,
jint _resWidth, jint _resHeight, jint _bitRate)
{
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "initDecoder called");
int len;
resWidth = _resWidth;
resHeight = _resHeight;
bitRate = _bitRate;
av_log_set_callback(my_log_callback);
av_log_set_level(AV_LOG_VERBOSE);
avcodec_register_all();
codec = avcodec_find_encoder(AV_CODEC_ID_MPEG4);
if (!codec) {
__android_log_print(ANDROID_LOG_ERROR, DEBUG_TAG, "codec %d not found", AV_CODEC_ID_MPEG4);
return -1;
}
context = avcodec_alloc_context3(codec);
if (!context) {
__android_log_print(ANDROID_LOG_ERROR, DEBUG_TAG, "Could not allocate codec context");
return -1;
}
context->width = resWidth;
context->height = resHeight;
context->bit_rate = bitRate;
context->pix_fmt = AV_PIX_FMT_YUV420P;
context->time_base.den = 6;
context->time_base.num = 1;
int openRet = avcodec_open2(context, codec, NULL);
if (openRet < 0) {
__android_log_print(ANDROID_LOG_ERROR, DEBUG_TAG, "Could not open codec, error:%d", openRet);
return -1;
}
current_frame = av_frame_alloc();
if (!current_frame) {
__android_log_print(ANDROID_LOG_ERROR, DEBUG_TAG, "Could not allocate video frame");
return -1;
}
return 0;
}
void my_log_callback(void *ptr, int level, const char *fmt, va_list vargs) {
__android_log_print (level, DEBUG_TAG, fmt, vargs);
}
jint Java_com_mycompany_axonv2_LibFFMpeg_queueFrameForDecoding(JNIEnv * env, jobject this,
jlong pts, jbyteArray jBuffer)
{
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "queueFrameForDecoding called");
AVPacket avpkt;
av_init_packet(&avpkt);
int buffer_len = (*env)->GetArrayLength(env, jBuffer);
uint8_t* buffer = (uint8_t *) (*env)->GetByteArrayElements(env, jBuffer,0);
int got_frame = 0;
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "copied %d bytes into uint8_t* buffer", buffer_len);
av_packet_from_data(&avpkt, buffer, buffer_len);
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "av_packet_from_data called");
avpkt.pts = pts;
int ret = avcodec_decode_video2(context, current_frame, &got_frame, &avpkt);
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "avcodec_decode_video2 returned %d" , ret);
(*env)->ReleaseByteArrayElements(env, jBuffer, (jbyte*) buffer, 0);
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "ReleaseByteArrayElements()");
return 0;
}

Alright, so the init function above works fine and queueFrameForDecoding works up until the avcodec_decode_video2 function. I'm not expecting it to work quite yet; however, as I've been logging output to trace how far we get inside that function, I've found that there is a call (in libavcodec/utils.c, around line 2400 in the latest code):

avcodec_decode_video2(...) {
 ....
 ret = avctx->codec->decode(avctx, picture, got_picture_ptr, &tmp);

init runs fine and finds the codec and all that. Everything works great up until the avcodec_decode_video2 call:
*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
Build fingerprint: 'samsung/klteuc/klteatt:4.4.2/KOT49H/G900AUCU2ANG3:user/release-keys'
Revision: '14'
pid: 19355, tid: 22584, name: BluetoothReadTh >>> com.mycompany.axonv2 <<<
signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 00000000
r0 79308400 r1 79491710 r2 7b0b4a70 r3 7b0b49e8
r4 79308400 r5 79491710 r6 00000000 r7 7b0b49e8
r8 7b0b4a70 r9 7b0b4a80 sl 795106d8 fp 00000000
ip 00000000 sp 7b0b49b8 lr 7ba05c18 pc 00000000 cpsr 600f0010
d0 206c616768616c62 d1 6564206365646f63
d2 756f722065646f63 d3 20736920656e6974
d4 0b0a01000a0a0a0b d5 0a630a01000a0a0a
d6 0a630a011a00f80a d7 0b130a011a00f90a
d8 0000000000000000 d9 0000000000000000
d10 0000000000000000 d11 0000000000000000
d12 0000000000000000 d13 0000000000000000
d14 0000000000000000 d15 0000000000000000
d16 6369705f746f6720 d17 7274705f65727574
d18 8000000000000000 d19 00000b9e42bd5730
d20 0000000000000000 d21 0000000000000000
d22 7b4fd10400000000 d23 773b894877483b68
d24 0000000000000000 d25 3fc2f112df3e5244
d26 40026bb1bbb55516 d27 0000000000000000
d28 0000000000000000 d29 0000000000000000
d30 0000000000000000 d31 0000000000000000
scr 60000010
backtrace:
#00 pc 00000000 <unknown>
#01 pc 00635c14 /data/app-lib/com.mycompany.axonv2-6/libavcodec-56.so (avcodec_decode_video2+1128)
AVCodec ff_mpeg4_decoder = {
.name = "mpeg4",
.long_name = NULL_IF_CONFIG_SMALL("MPEG-4 part 2"),
.type = AVMEDIA_TYPE_VIDEO,
.id = AV_CODEC_ID_MPEG4,
.priv_data_size = sizeof(Mpeg4DecContext),
.init = decode_init,
.close = ff_h263_decode_end,
.decode = ff_h263_decode_frame,
.capabilities = CODEC_CAP_DRAW_HORIZ_BAND | CODEC_CAP_DR1 |
CODEC_CAP_TRUNCATED | CODEC_CAP_DELAY |
CODEC_CAP_FRAME_THREADS,
.flush = ff_mpeg_flush,
.max_lowres = 3,
.pix_fmts = ff_h263_hwaccel_pixfmt_list_420,
.profiles = NULL_IF_CONFIG_SMALL(mpeg4_video_profiles),
.update_thread_context = ONLY_IF_THREADS_ENABLED(mpeg4_update_thread_context),
.priv_class = &mpeg4_class,
};I know that the ff_h263_decode_frame function isn’t being called because I added logging to it and none of that gets printed.
However, if I just call ff_h263_decode_frame directly from avcodec_decode_video2, then my logging gets output. I don't want to call this function directly, though, and would rather get the ffmpeg framework working correctly. Is there something wrong with how I've configured ffmpeg? I have added mpegvideo, mpeg2video, flv, and h263 to the configure script, but none of them has helped (they should be included automatically by --enable-decoder=mpeg4). Any help would be greatly appreciated.
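As a side check on the build itself (assuming the ffmpeg build directory is still available): configure writes a CONFIG_*_DECODER entry for every enabled decoder into the generated config.h, so grepping it shows whether the mpeg4 and h263 decoders really made it into this particular build. The path below is a placeholder:

grep -E "CONFIG_(MPEG4|H263)_DECODER" /path/to/ffmpeg/config.h
# expect lines like "#define CONFIG_MPEG4_DECODER 1" when a decoder is compiled in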
-
Files created with "ffmpeg hevc_nvenc" do not play on TV. (with video codec SDK 9.1 of nvidia)
29 January 2020, by Dashhh
Problem
- Files created with hevc_nvenc do not play on TV (Samsung smart TV, model unknown).
My ffmpeg build configuration is below.
FFmpeg build conf
$ ffmpeg -buildconf
--enable-cuda
--enable-cuvid
--enable-nvenc
--enable-nonfree
--enable-libnpp
--extra-cflags=-I/path/cuda/include
--extra-ldflags=-L/path/cuda/lib64
--prefix=/prefix/ffmpeg_build
--pkg-config-flags=--static
--extra-libs='-lpthread -lm'
--extra-cflags=-I/prefix/ffmpeg_build/include
--extra-ldflags=-L/prefix/ffmpeg_build/lib
--enable-gpl
--enable-nonfree
--enable-version3
--disable-stripping
--enable-avisynth
--enable-libass
--enable-libfontconfig
--enable-libfreetype
--enable-libfribidi
--enable-libgme
--enable-libgsm
--enable-librubberband
--enable-libshine
--enable-libsnappy
--enable-libssh
--enable-libtwolame
--enable-libwavpack
--enable-libzvbi
--enable-openal
--enable-sdl2
--enable-libdrm
--enable-frei0r
--enable-ladspa
--enable-libpulse
--enable-libsoxr
--enable-libspeex
--enable-avfilter
--enable-postproc
--enable-pthreads
--enable-libfdk-aac
--enable-libmp3lame
--enable-libopus
--enable-libtheora
--enable-libvorbis
--enable-libvpx
--enable-libx264
--enable-libx265
--disable-ffplay
--enable-libopenjpeg
--enable-libwebp
--enable-libxvid
--enable-libvidstab
--enable-libopenh264
--enable-zlib
--enable-openssl

ffmpeg Command
- The FFmpeg command used for encoding:
ffmpeg -ss 1800 -vsync 0 -hwaccel cuvid -hwaccel_device 0 \
-c:v h264_cuvid -i /data/input.mp4 -t 10 \
-filter_complex "\
[0:v]hwdownload,format=nv12,format=yuv420p,\
scale=iw*2:ih*2" -gpu 0 -c:v hevc_nvenc -pix_fmt yuv444p16le -preset slow -rc cbr_hq -b:v 5000k -maxrate 7000k -bufsize 1000k -acodec aac -ac 2 -dts_delta_threshold 1000 -ab 128k -flags global_header ./makevideo_nvenc_hevc.mp4

Full log of this command: check this full log.
The reason for adding the "-color_" options to the command is as follows.
- I want HDR video, i.e. bt2020 + smpte2084, created with the NVIDIA hardware encoder. (I'm studying how to make HDR videos; I'm not sure if this is right.)
How can I make a video using ffmpeg hevc_nvenc and have it play on the TV?
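For reference, a minimal sketch of how those color properties can be declared on the encoding command line, using the same -color_primaries/-color_trc/-colorspace/-color_range flags that appear in the remux command further down. The file names and bitrate are placeholders, and whether hevc_nvenc also writes these values into the HEVC VUI depends on the ffmpeg and SDK versions, which is essentially what this question is about:

ffmpeg -i input.mp4 -c:v hevc_nvenc -preset slow -b:v 5000k \
 -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc -color_range tv \
 -c:a copy output_tagged.mp4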
Things I've done
Here’s what I’ve researched about why it doesn’t work.
The header information is not properly included in the resulting video file. So I used a program called nvhsp to add SEI and VUI information inside the video. See below for the commands and logs used.
nvhsp is an open-source tool for writing VUI and SEI bit strings into raw video (nvhsp link).

# make rawvideo for nvhsp
$ ffmpeg -vsync 0 -hwaccel cuvid -hwaccel_device 0 -c:v h264_cuvid \
-i /data/input.mp4 -t 10 \
-filter_complex "[0:v]hwdownload,format=nv12,\
format=yuv420p,scale=iw*2:ih*2" \
-gpu 0 -c:v hevc_nvenc -f rawvideo output_for_nvhsp.265
# use nvhsp
$ python nvhsp.py ./output_for_nvhsp.265 -colorprim bt2020 \
-transfer smpte-st-2084 -colormatrix bt2020nc \
-maxcll "1000,300" -videoformat ntsc -full_range tv \
-masterdisplay "G (13250,34500) B (7500,3000 ) R (34000,16000) WP (15635,16450) L (10000000,1)" \
./after_nvhsp_proc_output.265
Parsing the infile:
==========================
Prepending SEI data
Starting new SEI NALu ...
SEI message with MaxCLL = 1000 and MaxFall = 300 created in SEI NAL
SEI message Mastering Display Data G (13250,34500) B (7500,3000) R (34000,16000) WP (15635,16450) L (10000000,1) created in SEI NAL
Looking for SPS ......... [232, 22703552]
SPS_Nals_addresses [232, 22703552]
SPS NAL Size 488
Starting reading SPS NAL contents
Reading of SPS NAL finished. Read 448 of SPS NALu data.
Making modified SPS NALu ...
Made modified SPS NALu-OK
New SEI prepended
Writing new stream ...
Progress: 100%
=====================
Done!
File nvhsp_after_output.mp4 created.
# after process
$ ffmpeg -y -f rawvideo -r 25 -s 3840x2160 -pix_fmt yuv444p16le -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc -color_range tv -i ./1/after_nvhsp_proc_output.265 -vcodec copy ./1/result.mp4 -hide_banner
Truncating packet of size 49766400 to 3260044
[rawvideo @ 0x40a6400] Estimating duration from bitrate, this may be inaccurate
Input #0, rawvideo, from './1/nvhsp_after_output.265':
Duration: N/A, start: 0.000000, bitrate: 9953280 kb/s
Stream #0:0: Video: rawvideo (Y3[0][16] / 0x10003359), yuv444p16le(tv, bt2020nc/bt2020/smpte2084), 3840x2160, 9953280 kb/s, 25 tbr, 25 tbn, 25 tbc
[mp4 @ 0x40b0440] Could not find tag for codec rawvideo in stream #0, codec not currently supported in container
Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Last message repeated 1 times
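A note on the mux failure above, as a guess rather than a verified fix: the .265 file is read with the rawvideo demuxer, and rawvideo cannot be stream-copied into an mp4 container, hence the "Could not find tag for codec rawvideo" error. A raw HEVC elementary stream is normally read with ffmpeg's hevc demuxer instead; a minimal sketch with placeholder file names:

ffmpeg -y -r 25 -f hevc -i after_nvhsp_proc_output.265 -c:v copy result_copy.mp4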
Goal
- I want the metadata to be generated normally when encoding a video through hevc_nvenc.
- I want to create a video through hevc_nvenc and play HDR video on a smart TV that supports 10-bit color depth.
Additional
-
Is it normal for ffmpeg hevc_nvenc not to generate this metadata in the resulting video file, or is it a bug?
-
Please refer to the image below. (*’알 수 없음’ meaning ’unknown’)
- if you need more detail file info, check this Gist Link (by ffprobe)
-
However, if you encode a file in libx265, the attribute information is entered correctly as shown below.
- if you need more detail file info, check this Gist Link
However, when using hevc_nvenc, all of this information is missing.
- I used the options -show_streams -show_programs -show_format -show_data -of json -show_frames -show_log 56 with ffprobe.
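Assembled into a full command against the output of the encode above, that probe call looks like:

ffprobe -show_streams -show_programs -show_format -show_data -of json -show_frames -show_log 56 makevideo_nvenc_hevc.mp4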