
Other articles (18)
-
Possible deployments
31 January 2010. Two types of deployment are possible, depending on two factors: the installation method chosen (standalone or as a farm); and the number of daily encodings and the expected traffic.
Video encoding is a heavy process that consumes a great deal of system resources (CPU and RAM), and all of this needs to be taken into account. The system is therefore only feasible on one or more dedicated servers.
Single-server version
The single-server version consists of using only one (...) -
Configurable image and logo sizes
9 February 2011. In many places on the site, logos and images are resized to fit the slots defined by the themes. Since these sizes can change from one theme to another, they can be defined directly in the theme, saving the user from having to configure them manually after changing the appearance of the site.
These image sizes are also available in the specific configuration of MediaSPIP Core. The maximum size of the site logo in pixels, allowing (...) -
Contribute to a better visual interface
13 April 2011. MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.
On other sites (6987)
-
FFmpeg for marking time in video based on a reference date
3 August 2016, by Denio Mariz. I am trying to mark a timestamp in a video using the drawtext filter.
FFmpeg easily marks timestamps based on localtime, gmtime or even PTS. However, I want to assign a reference time (start time) to the timestamp so that it represents the time the video was recorded (not encoded). Reading the documentation, I found that the option "basetime" can be used for this purpose. However, it seems that it is not working, or I am missing something.
The command line I am using is:
ffmpeg -y -i input.mp4 -filter_complex drawtext="fontfile=/tmp/UbuntuMono-B.ttf: fontsize=36: fontcolor=yellow: box=1: boxcolor=black@0.4: text='Wall Clock Time\: %{gmtime\:%Y-%m-%d %T}': basetime=1456007118" output.mp4
By using basetime=1456007118, I expected the start time to be set to '02/20/2016 20:25:18', since 1456007118 is the UTC timestamp for that date and time:
date -d '02/20/2016 20:25:18' +"%s" # format MM/DD/YYYY hh:mm:ss
1456007118
However, no error is issued by FFmpeg and the video is marked with the current GMT time, ignoring the "basetime" option.
Any hint?
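A workaround that is often suggested for this, not part of the original post: instead of basetime, use the drawtext pts text-expansion function, which accepts an optional offset in seconds that is added to the frame timestamp before it is formatted as UTC. A minimal sketch, reusing the font and paths from the command above:
ffmpeg -y -i input.mp4 -filter_complex drawtext="fontfile=/tmp/UbuntuMono-B.ttf: fontsize=36: fontcolor=yellow: box=1: boxcolor=black@0.4: text='Wall Clock Time\: %{pts\:gmtime\:1456007118}'" output.mp4
Here 1456007118 is the intended recording start time as a Unix timestamp, so each frame should be stamped with the start time plus its PTS rather than with the encoding-time clock.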
Thanks. Complete information about the FFmpeg version and output is:
ffmpeg -y -i /home/denio/Videos/Interstellar_2014_Trailer_4_5.1-1080p-HDTN.mp4 -filter_complex drawtext="fontfile=/tmp/UbuntuMono-B.ttf: fontsize=36: fontcolor=yellow: box=1: boxcolor=black@0.4: text='Wall Clock Time\: %{gmtime\:%Y-%m-%d %T}': basetime=1470226363" /tmp/x.mp4
ffmpeg version 3.1.1 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.3.1 (Ubuntu 5.3.1-14ubuntu2.1) 20160413
configuration: --enable-libxavs --enable-bzlib --enable-libfaac --enable-libfreetype --enable-libfontconfig --enable-libmp3lame --enable-libschroedinger --enable-libspeex --enable-libvorbis --enable-libx264 --enable-libx265 --enable-libxvid --enable-zlib --enable-x11grab --enable-static --enable-pthreads --enable-gpl --enable-nonfree --enable-version3 --disable-ffserver --enable-libgsm --enable-librtmp --enable-libvpx --enable-libschroedinger --enable-libopencore-amrnb --enable-libopenjpeg
libavutil 55. 28.100 / 55. 28.100
libavcodec 57. 48.101 / 57. 48.101
libavformat 57. 41.100 / 57. 41.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 47.100 / 6. 47.100
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 1.100 / 2. 1.100
libpostproc 54. 0.100 / 54. 0.100
...
... -
fate/apng: add test for apng decoding
2 July 2016, by Martin Vignali -
Errors when streaming h264 video from gstreamer to ffmpeg
14 June 2016, by Michael Ng. Hi, I am trying to receive a UDP/RTP stream with ffmpeg on the client side but am having trouble.
Server-side pipeline:
gst-launch-1.0 -v filesrc location=video2.mp4 ! decodebin ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5006
On the client side, I can play the video with the following pipeline:
gst-launch-1.0 -e udpsrc uri=udp://0.0.0.0:5006 ! application/x-rtp, clock-rate=90000, payload=96 ! rtph264depay ! decodebin ! autovideosink
However, since I want to convert the stream into an RTSP/HTTP stream, I tried to receive the RTP stream with ffmpeg and do something like:
ffmpeg -i udp://127.0.0.1:5006 -acodec copy -vcodec copy http://localhost:8090/feed1.ffm
But before doing that, I tested this approach by saving the stream to an mp4 file with:
ffmpeg -f h264 -i udp://127.0.0.1:5006 -strict -2 -f mp4 stream.mp4
But this did not work; it gave me the following error:
missing picture in access unit with size 15019525
[h264 @ 0x11d5100] no frame!
[h264 @ 0x11f06c0] decoding for stream 0 failed
[h264 @ 0x11f06c0] Could not find codec parameters for stream 0 (Video: h264): unspecified size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
[h264 @ 0x11f06c0] Estimating duration from bitrate, this may be inaccurate
udp://127.0.0.1:5006: could not find codec parameters
Has anyone tried such an approach before or experienced a similar problem? I would like to get some direction on how to solve it. Thanks!
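One direction worth trying, sketched here under the assumption that the pipelines above are unchanged: a bare udp:// URL only gives ffmpeg raw RTP packets with no payload description, so it cannot work out the H.264 depacketization on its own. Describing the stream in a small SDP file and opening that instead usually works; the file name stream.sdp and the session fields are illustrative, only the address, port and payload type are taken from the pipelines above. The SDP file could contain:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=GStreamer H264 stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H264/90000
and then:
ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp -c copy stream.mp4
Alternatively, replacing rtph264pay with mpegtsmux on the sender side should wrap the video in MPEG-TS, which ffmpeg can read directly from udp://127.0.0.1:5006.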