
Other articles (111)
-
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page -
Publishing on MédiaSpip
13 June 2013 - Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out. -
From upload to the final video [standalone version]
31 January 2010 - The path of an audio or video document in SPIPMotion is divided into three distinct steps.
Upload and retrieval of information from the source video
First of all, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
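As a purely generic illustration of what those two extra actions amount to (this is not SPIPMotion's code; the ffprobe/ffmpeg paths and file names are placeholders), the stream information can be read with ffprobe and a thumbnail frame extracted with ffmpeg, for instance from Java:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

// Post-upload processing sketch: probe the attached video, then grab one frame.
public class UploadPostProcess {

    // Retrieve the technical information of the file's audio and video streams.
    static String probeStreams(String ffprobeBin, String videoPath)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(ffprobeBin, "-v", "error",
                "-show_format", "-show_streams", videoPath)
                .redirectErrorStream(true)
                .start();
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        p.waitFor();
        return out.toString();
    }

    // Generate a thumbnail by extracting a single frame one second in.
    static void extractThumbnail(String ffmpegBin, String videoPath, String thumbPath)
            throws IOException, InterruptedException {
        new ProcessBuilder(ffmpegBin, "-y", "-ss", "1", "-i", videoPath,
                "-frames:v", "1", thumbPath)
                .inheritIO()
                .start()
                .waitFor();
    }
}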
On other sites (12744)
-
Merge commit '1329c08ad6d2ddb304858f2972c67b508e8b0f0e'
11 November 2017, by James Almer -
Trying to concat 2 videos using FFMPEG in Android command line
23 March 2016, by Hero Roma - I am trying to add a cross-fade transition between 2 videos in my Android application. I have used the Guardian Project to build the ffmpeg library locally, and I was able to do that.
Now I am trying to implement the cross-fade using this StackOverflow answer.
The code is basically something like:
ArrayList<String> cmd = new ArrayList<String>();
cmd.add(mFfmpegBin);
cmd.add("-i");
cmd.add(firstFile);
cmd.add("-i");
cmd.add(secondFile);
cmd.add("-f");
cmd.add("lavfi");
cmd.add("-i");
cmd.add("color=black");
cmd.add("-filter_complex");
cmd.add("[0:v]format=pix_fmts=yuva420p,fade=t=out:st=4:d=1:alpha=1,setpts=PTS-STARTPTS[va0];[1:v]format=pix_fmts=yuva420p,fade=t=in:st=0:d=1:alpha=1,setpts=PTS-STARTPTS+4/TB[va1];[2:v]scale=960x720,trim=duration=9[over];[over][va0]overlay[over1];[over1][va1]overlay=format=yuv420[outv]");
cmd.add("-vcodec");
cmd.add("libx264");
cmd.add("-map");
cmd.add("[outv]");
cmd.add(outputFile);
execFFMPEG(cmd, sc);

The error I keep getting is:
Invalid pixel format 'pix_fmts=yuva420p'
I feel that either the command is incorrect for command-line usage or I am not providing the correct filters.
fc>/data/data/com.User.testapplication/app_bin/ffmpeg -i /storage/emulated/0/Pictures/DayVideos/FIRST_VIDEO.mp4 -i /storage/emulated/0/Pictures/DayVideos/SECOND_VIDEO.mp4 -f lavfi -i color=black -filter_complex [0:v]format=pix_fmts=yuva420p,fade=t=out:st=4:d=1:alpha=1,setpts=PTS-STARTPTS[va0];[1:v]format=pix_fmts=yuva420p,fade=t=in:st=0:d=1:alpha=1,setpts=PTS-STARTPTS+4/TB[va1];[2:v]scale=960x720,trim=duration=9[over];[over][va0]overlay[over1];[over1][va1]overlay=format=yuv420[outv] -vcodec libx264 -map [outv] /storage/emulated/0/Pictures/DayVideos/output.mp4
fc>WARNING: linker: /data/data/com.User.testapplication/app_bin/ffmpeg has text relocations. This is wasting memory and prevents security hardening. Please fix.
fc>ffmpeg version 0.11.1 Copyright (c) 2000-2012 the FFmpeg developers
fc> built on Mar 23 2016 12:12:02 with gcc 4.6 20120106 (prerelease)
fc> configuration: --arch=arm --cpu=cortex-a8 --target-os=linux --enable-runtime-cpudetect --prefix=/data/data/info.guardianproject.ffmpeg/app_opt --enable-pic --disable-shared --enable-static --cross-prefix=/home/test/Downloads/android-ndk-r8e/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64/bin/arm-linux-androideabi- --sysroot=/home/test/Downloads/android-ndk-r8e/platforms/android-14/arch-arm --extra-cflags='-I../x264 -mfloat-abi=softfp -mfpu=neon -fPIE -pie' --extra-ldflags='-L../x264 -fPIE -pie' --enable-version3 --enable-gpl --disable-doc --enable-yasm --enable-decoders --enable-encoders --enable-muxers --enable-demuxers --enable-parsers --enable-protocols --enable-filters --enable-avresample --disable-indevs --enable-indev=lavfi --disable-outdevs --enable-hwaccels --enable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffserver --disable-network --enable-libx264 --enable-zlib --enable-muxer=md5
fc> libavutil 51. 54.100 / 51. 54.100
fc> libavcodec 54. 23.100 / 54. 23.100
fc> libavformat 54. 6.100 / 54. 6.100
fc> libavdevice 54. 0.100 / 54. 0.100
fc> libavfilter 2. 77.100 / 2. 77.100
fc> libswscale 2. 1.100 / 2. 1.100
fc> libswresample 0. 15.100 / 0. 15.100
fc> libpostproc 52. 0.100 / 52. 0.100
fc>Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Pictures/DayVideos/FIRST_VIDEO.mp4':
fc> Metadata:
fc> major_brand : isom
fc> minor_version : 0
fc> compatible_brands: isomiso2avc1
fc> creation_time : 2016-03-22 19:07:16
fc> Duration: 00:00:19.35, start: 0.000000, bitrate: 5253 kb/s
fc> Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1280x720, 5252 kb/s, SAR 65536:65536 DAR 16:9, 30.02 fps, 30 tbr, 90k tbn, 180k tbc
fc> Metadata:
fc> creation_time : 2016-03-22 19:07:02
fc>Input #1, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Pictures/DayVideos/SECOND_VIDEO.mp4':
fc> Metadata:
fc> major_brand : mp42
fc> minor_version : 0
fc> compatible_brands: isommp42
fc> creation_time : 2016-03-22 19:07:55
fc> Duration: 00:00:17.32, start: 0.000000, bitrate: 5414 kb/s
fc> Stream #1:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1280x720, 5227 kb/s, SAR 65536:65536 DAR 16:9, 30.02 fps, 30.02 tbr, 90k tbn, 180k tbc
fc> Metadata:
fc> creation_time : 2016-03-22 19:07:55
fc> handler_name : VideoHandle
fc>[color @ 0xb8901360] w:320 h:240 r:25/1 color:0x000000ff
fc>[lavfi @ 0xb88fbb60] Estimating duration from bitrate, this may be inaccurate
fc>Input #2, lavfi, from 'color=black':
fc> Duration: N/A, start: 0.000000, bitrate: N/A
fc> Stream #2:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 320x240 [SAR 1:1 DAR 4:3], 25 tbr, 25 tbn, 25 tbc
fc>[format @ 0xb8901190] Invalid pixel format 'pix_fmts=yuva420p'
fc>Error initializing filter 'format' with args 'pix_fmts=yuva420p'
fc>Error configuring filters.

Thanks!
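For what it's worth, the error message suggests the whole string pix_fmts=yuva420p is being treated as a pixel format name, i.e. this old 0.11.1 build appears to parse the format filter's argument positionally rather than as a named pix_fmts option. Below is a minimal sketch with only that part changed (format=yuva420p instead of format=pix_fmts=yuva420p); everything else, including mFfmpegBin, firstFile, secondFile, outputFile, execFFMPEG and sc, is taken from the snippet above, so treat it as a guess rather than a verified fix.

// Same command as above; only the format filter arguments differ.
String filterGraph =
        "[0:v]format=yuva420p,fade=t=out:st=4:d=1:alpha=1,setpts=PTS-STARTPTS[va0];" +
        "[1:v]format=yuva420p,fade=t=in:st=0:d=1:alpha=1,setpts=PTS-STARTPTS+4/TB[va1];" +
        "[2:v]scale=960x720,trim=duration=9[over];" +
        "[over][va0]overlay[over1];" +
        "[over1][va1]overlay=format=yuv420[outv]";

ArrayList<String> cmd = new ArrayList<String>();
cmd.add(mFfmpegBin);
cmd.add("-i");
cmd.add(firstFile);
cmd.add("-i");
cmd.add(secondFile);
cmd.add("-f");
cmd.add("lavfi");
cmd.add("-i");
cmd.add("color=black");      // synthetic black background from the lavfi device
cmd.add("-filter_complex");
cmd.add(filterGraph);
cmd.add("-vcodec");
cmd.add("libx264");
cmd.add("-map");
cmd.add("[outv]");
cmd.add(outputFile);
execFFMPEG(cmd, sc);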
-
mpegts proxy allowing multiple concurrent views of same source feed
5 July 2018, by Yass T - I am having trouble finding a (preferably non-commercial) Linux-compatible project that would allow me to "restream" one particular MPEGTS live source feed to multiple clients, without requesting it more than once from the source.
Imagine an MPEGTS live source feed coming from http://my_mpegts_provider (imagine we have the direct link, no m3u-style playlist). I would like a program to be able to grab that source and proxy it to one or more client devices (say, VLC players) when they request it, even if they do not request it at the same time.
In other words, can we proxy the MPEGTS stream so that only our proxy is seen as the "client" (from the source provider's point of view) when multiple devices (on the same network) connect to the SAME source on this side?
I tried the node-ffmpeg-mpegts-proxy but it clearly showed two separate connections to the source when asking for the same "channel" with two different VLC clients (confirmed with netstat).
Thanks, and sorry in advance if my post is outside StackOverflow's rules; I can't think of a better place to gather some good opinions.
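In case a sketch helps frame what "restream" means here: one upstream HTTP connection to the source, a listening socket for the clients, and every chunk read from upstream copied to all currently connected clients. Java is used only to match the other snippet on this page; the source URL and port are placeholders, and this is nowhere near a production proxy (no reconnection, no per-client buffering, and client request headers are ignored).

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// One connection upstream, N clients downstream.
public class TsFanOutProxy {
    public static void main(String[] args) throws IOException {
        String sourceUrl = "http://my_mpegts_provider/stream.ts"; // placeholder
        int listenPort = 9000;                                    // placeholder
        List<OutputStream> clients = new CopyOnWriteArrayList<>();

        // Accept clients (e.g. a player opening http://proxy_host:9000/) in the background.
        ServerSocket server = new ServerSocket(listenPort);
        new Thread(() -> {
            while (true) {
                try {
                    Socket s = server.accept();
                    OutputStream out = s.getOutputStream();
                    // Minimal HTTP response so a player can treat this as an HTTP TS stream.
                    out.write(("HTTP/1.1 200 OK\r\n"
                            + "Content-Type: video/mp2t\r\n"
                            + "Connection: close\r\n\r\n")
                            .getBytes(StandardCharsets.US_ASCII));
                    clients.add(out);
                } catch (IOException ignored) {
                }
            }
        }).start();

        // Single request to the source; each received chunk is copied to every client.
        HttpURLConnection conn = (HttpURLConnection) new URL(sourceUrl).openConnection();
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[188 * 64]; // multiple of the 188-byte TS packet size
            int n;
            while ((n = in.read(buf)) != -1) {
                for (OutputStream out : clients) {
                    try {
                        out.write(buf, 0, n);
                    } catch (IOException e) {
                        clients.remove(out); // drop clients that disconnected
                    }
                }
            }
        }
    }
}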