
Media (91)
-
Corona Radiata
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Lights in the Sky
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Head Down
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Echoplex
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Discipline
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Letting You
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
Other articles (102)
-
Managing creation and editing rights for objects
8 February 2011, by
By default, many features are restricted to administrators, but each remains independently configurable so that the minimum status required to use it can be changed, notably: writing content on the site, adjustable in the form template management; adding notes to articles; adding captions and annotations to images;
-
Depositing media and themes via FTP
31 May 2013, by
The MédiaSPIP tool also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MédiaSPIP site and use your favourite FTP client.
From the start you will find the following folders in your FTP space: config/ : the site's configuration folder; IMG/ : the media already processed and online on the site; local/ : the website's cache directory; themes/ : custom themes or stylesheets; tmp/ : working folder (...)
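For example, uploading a custom theme over FTP could look like this (a minimal sketch, assuming the lftp client; ftp.example.org, login, password and my-theme are placeholders for your own site's credentials and theme folder):
lftp -u login,password ftp.example.org -e "mirror -R my-theme themes/my-theme; bye"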
-
APPENDIX: Plugins used specifically for the farm
5 March 2010, by
The central/master site of the farm needs several additional plugins, compared to the channel sites, for it to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a mutualisation instance when users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (13321)
-
FFmpeg: Could not get frame filename with strftime
7 December 2016, by clic
I want to have a timestamp in my JPEG files generated by FFmpeg.
I'm getting the above error message when executing FFmpeg like this:
ffmpeg -f dshow -framerate 50 -i video="XI100DUSB-HDMI Video" -strftime 1 "%Y-%m-%d_%H-%M-%S_thumb%04d.jpg"
It also says:
av_interleaved_write_frame(): Invalid argument
What am I doing wrong?
Thanks!
Complete FFmpeg output:
ffmpeg version 3.2 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.4.0 (GCC)
configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth --enable-bzlib --enable-libebur128 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-li
bopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
libavutil 55. 34.100 / 55. 34.100
libavcodec 57. 64.100 / 57. 64.100
libavformat 57. 56.100 / 57. 56.100
libavdevice 57. 1.100 / 57. 1.100
libavfilter 6. 65.100 / 6. 65.100
libswscale 4. 2.100 / 4. 2.100
libswresample 2. 3.100 / 2. 3.100
libpostproc 54. 1.100 / 54. 1.100
Input #0, dshow, from 'video=XI100DUSB-HDMI Video':
Duration: N/A, start: 777552.702000, bitrate: N/A
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1920x1080, 50 fps, 50 tbr, 10000k tbn, 10000k tbc
[swscaler @ 0000000001c8c940] deprecated pixel format used, make sure you did set range correctly
[mjpeg @ 0000000001c7ea60] removing common factors from framerate
Output #0, image2, to '%Y-%m-%d_%H-%M-%S_thumb%04d.jpg':
Metadata:
encoder : Lavf57.56.100
Stream #0:0: Video: mjpeg, yuvj422p(pc), 1920x1080, q=2-31, 200 kb/s, 50 fps, 50 tbn, 50 tbc
Metadata:
encoder : Lavc57.64.100 mjpeg
Side data:
cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[image2 @ 0000000001c7afe0] Could not get frame filename with strftime
av_interleaved_write_frame(): Invalid argument
frame= 1 fps=0.0 q=7.9 Lsize=N/A time=00:00:00.02 bitrate=N/A speed=0.687x
video:101kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
[dshow @ 0000000001c728a0] real-time buffer [XI100DUSB-HDMI Video] [video input] too full or near too full (136% of size: 3041280 [rtbufsize parameter])! frame dropped!
Conversion failed!
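A likely explanation: with -strftime 1, the image2 muxer expands the whole output filename with the C library's strftime(), so a frame-number pattern such as %04d is not substituted there, and with the Windows C runtime an unsupported specifier makes strftime() fail, which is exactly the "Could not get frame filename with strftime" error shown above. A sketch that keeps only strftime specifiers (here also assuming one thumbnail per second via the fps filter, so that second-resolution filenames do not overwrite each other):
ffmpeg -f dshow -framerate 50 -i video="XI100DUSB-HDMI Video" -vf fps=1 -strftime 1 "%Y-%m-%d_%H-%M-%S_thumb.jpg"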
FFmpeg "no frame !" encoding error
9 juillet 2015, par oleg.semenI’m trying to compress video by scaling it. Here is how
But I’m getting this error :D/FFMpeg﹕ progress[h264 @ 0x42124970] no frame!
D/FFMpeg﹕ progress[aac @ 0x42122fe0] Input buffer exhausted before END element foundhere is whole log :
Loading FFmpeg for armv7-neon CPU
start
Running publishing updates method
progressWARNING: linker: /data/data/com.example.ffmpeg/files/ffmpeg has text relocations. This is wasting memory and is a security risk. Please fix.
progressffmpeg version n2.4.2 Copyright (c) 2000-2014 the FFmpeg developers
progress built on Oct 7 2014 15:08:46 with gcc 4.8 (GCC)
progress configuration: --target-os=linux --cross-prefix=/home/sb/Source-Code/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/sb/Source-Code/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/sb/Source-Code/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/sb/Source-Code/ffmpeg-android/build/armeabi-v7a-neon --extra-cflags='-I/home/sb/Source-Code/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all -mfpu=neon' --extra-ldflags='-L/home/sb/Source-Code/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
progress libavutil 54. 7.100 / 54. 7.100
progress libavcodec 56. 1.100 / 56. 1.100
progress libavformat 56. 4.101 / 56. 4.101
progress libavdevice 56. 0.100 / 56. 0.100
progress libavfilter 5. 1.100 / 5. 1.100
progress libswscale 3. 0.100 / 3. 0.100
progress libswresample 1. 1.100 / 1. 1.100
progress libpostproc 53. 0.100 / 53. 0.100
progress[h264 @ 0x42124970] no frame!
progress[aac @ 0x42122fe0] Input buffer exhausted before END element found
progressInput #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/Movies/Instagram/VID_37551017_035953.mp4':
progress Metadata:
progress major_brand : isom
progress minor_version : 0
progress compatible_brands: isom3gp4
progress creation_time : 2015-06-19 09:03:19
progress Duration: 00:00:03.20, start: 0.000000, bitrate: 2975 kb/s
progress Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 640x640, 2911 kb/s, 15.22 fps, 14.92 tbr, 90k tbn, 180k tbc (default)
progress Metadata:
progress creation_time : 2015-06-19 09:03:19
progress handler_name : VideoHandle
progress Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, mono, fltp, 101 kb/s (default)
progress Metadata:
progress creation_time : 2015-06-19 09:03:19
progress handler_name : SoundHandle
Did I miss something in config?
Thanks.
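The command itself is not shown in the excerpt above; for reference, a typical scale-based compression with ffmpeg might look like the following (a sketch only, with a hypothetical output name, not the poster's actual invocation):
ffmpeg -i /storage/emulated/0/Movies/Instagram/VID_37551017_035953.mp4 -vf scale=320:-2 -c:v libx264 -crf 28 -preset veryfast -c:a copy out_scaled.mp4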
-
avformat/matroskaenc: Write duration early during mkv_write_header (Rev #3)
17 July 2016, by softworkz
avformat/matroskaenc: Write duration early during mkv_write_header (Rev #3)
Rev #2: Fixes doubled header writing, checked FATE running without errors
Rev #3: Fixed coding style
This commit addresses the following scenario:
We are using ffmpeg to transcode or remux mkv (or something else) to mkv. The result is being streamed on the fly to an HTML5 client (streaming starts while ffmpeg is still running). The problem here is that the client is unable to detect the duration, because the duration is only written to the mkv at the end of the transcoding/remuxing process. In matroskaenc.c, the duration is only written during mkv_write_trailer, but not during mkv_write_header.
The approach :
FFMPEG is currently putting quite some effort to estimate the durations of source streams, but in many cases the source stream durations are still left at 0 and these durations are nowhere mapped to or used for output streams. As much as I would have liked to deduct or estimate output durations based on input stream durations - I realized that this is a hard task (as Nicolas already mentioned in a previous conversation). It would involve changes to the duration calculation/estimation/deduction for input streams and propagating these durations to output streams or the output context in a correct way.
So I looked for a simple and small solution with better chances to get accepted. In webmdashenc.c I found that a duration is written during write_header and this duration is taken from the streams’ metadata, so I decided for a similar approach.And here’s what it does :
At first it is checking the duration of the AVFormatContext. In typical cases this value is not set, but : It is set in cases where the user has specified a recording_time or an end_time via the -t or -to parameters.
Then it is looking for a DURATION metadata field in the metadata of the output context (AVFormatContext::metadata). This would only exist in case the user has explicitly specified a metadata DURATION value from the command line.
Then it is iterating all streams looking for a "DURATION" metadata (this works unless the option "-map_metadata -1" has been specified) and determines the maximum value.
The precedence is as follows: 1. use the duration of the AVFormatContext; 2. use the explicitly specified metadata duration value; 3. use the maximum (mapped) metadata duration over all streams.
To test this:
1. With an explicit recording time:
ffmpeg -i file:"src.mkv" -loglevel debug -t 01:38:36.000 -y "dest.mkv"
2. Take the duration from metadata specified via command line parameters:
ffmpeg -i file:"src.mkv" -loglevel debug -map_metadata -1 -metadata Duration="01:14:33.00" -y "dest.mkv"
3. Take the duration from mapped input metadata:
ffmpeg -i file:"src.mkv" -loglevel debug -y "dest.mkv"
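To see what was written to the header while ffmpeg is still running, the in-progress output can be probed (a sketch, reusing the dest.mkv name from the tests above and assuming ffprobe is available):
ffprobe -hide_banner -v error -show_entries format=duration dest.mkv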
Regression risk:
Very low IMO, because it only affects the header while ffmpeg is still running. When ffmpeg completes the process, the duration is rewritten to the header with the usual value (the same as without this commit).
Signed-off-by: SoftWorkz <softworkz@hotmail.com>
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>