Advanced search

Media (3)


Other articles (112)

  • Customize by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or later. If needed, contact the administrator of your MediaSPIP to find out.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (14236)

  • Create a video with timestamp from multiple images with "picture taken date/time" in the meta data with ffmpeg or similar?

    16 March 2018, by m4D_guY

    I have two time-lapse videos with a rate of 1 fps. The camera took one image every minute. Unfortunately, the camera was not set to burn/print the time and date onto every image, so I am trying to burn the time and date into the video afterwards.

    I decoded the two .avi files with ffmpeg into 7000 single images each and wrote an R script that renamed the files to their "creation" date (the time and date the pictures were taken). Then I used exiftool to write that information into each file's EXIF/metadata.

    The final images in the folder look like this:

    2018-03-12 17_36_40.png

    2018-03-12 17_35_40.png

    2018-03-12 17_34_40.png

    ...

    Is it possible to create a video from these images again with ffmpeg or similar, with a "timestamp" in the video, so that a time and date stamp is visible while watching?
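
    One possible approach is a rough sketch along these lines (untested on this data): read the renamed images as a sequence whose timestamps advance by 60 seconds per frame, burn a wall-clock stamp with the drawtext filter, then compress the timestamps back to 1 fps for playback. The glob pattern, the font path, and the epoch value 1520876080 (standing in for the capture time of the first image) are placeholders to adapt.

    # Sketch only: one image per real minute, so -framerate 1/60 makes pts advance
    # by 60 s per frame; drawtext prints epoch+pts as local time; setpts=PTS/60
    # then restores 1 fps playback. Font path, glob pattern and epoch are placeholders
    # (glob input is not available on all builds; a numbered sequence works too).
    ffmpeg -framerate 1/60 -pattern_type glob -i '*.png' \
      -vf "drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf:text='%{pts\:localtime\:1520876080}':x=10:y=10:fontsize=24:fontcolor=white:box=1:boxcolor=black@0.5,setpts=PTS/60" \
      -c:v libx264 -pix_fmt yuv420p stamped_timelapse.mp4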

  • ffmpeg: Streaming with RTSP and recording at the same time

    1 December 2017, by Barzou

    My goal is to stream a part of my desktop to a remote machine, and to record the sent stream at the same time.
    The streaming technology chosen is the RTSP protocol, with FFserver as the RTSP server.

    I have already got the streaming part working with ffmpeg, but now I'm trying to record the stream at the same time. To do so, I'm trying to use the tee pseudo-muxer, as referenced here:
    http://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs

    https://ffmpeg.org/ffmpeg-formats.html#tee
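
    For reference, the pages linked above describe the tee muxer roughly as follows (a generic sketch with placeholder file names and host, not the exact command used here): slave outputs are separated by "|" and each slave can carry its own options in square brackets, such as a format or onfail=ignore.

    # Generic tee sketch (placeholders only): one local file plus one network slave;
    # onfail=ignore keeps the other slave running if one of them fails.
    ffmpeg -i input.mp4 -map 0 -c copy -f tee \
      "local_copy.mkv|[f=mpegts:onfail=ignore]udp://remote.host:1234"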

    Unfortunately, I get an error message when launching my command, which is the following:

    ffmpeg -loglevel debug  -f alsa -ac 2 -thread_queue_size 512 -ar 44100 -i hw:0 \
    -f x11grab -framerate 30 -show_region 1 -video_size 800x480 -thread_queue_size 512 -i :0.0 \
    -f tee -c:v libx264 -c:a mp3 -pix_fmt yuv422p  -flags +global_header -map 1:v -map 0:a \
    "test.mp4|[f=ffm]http://localhost:8000/feed.ffm/"

    And here is the output :

     ffmpeg version 3.4-1~xenial1 Copyright (c) 2000-2017 the FFmpeg developers
     built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.5) 20160609
     configuration: --prefix=/usr --extra-version='1~xenial1' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-nonfree --enable-libfdk-aac --enable-nvenc --enable-libdc1394 --enable-libiec61883 --enable-libopencv --enable-frei0r --enable-libx264 --enable-shared
     libavutil      55. 78.100 / 55. 78.100
     libavcodec     57.107.100 / 57.107.100
     libavformat    57. 83.100 / 57. 83.100
     libavdevice    57. 10.100 / 57. 10.100
     libavfilter     6.107.100 /  6.107.100
     libavresample   3.  7.  0 /  3.  7.  0
     libswscale      4.  8.100 /  4.  8.100
     libswresample   2.  9.100 /  2.  9.100
     libpostproc    54.  7.100 / 54.  7.100
    Guessed Channel Layout for Input Stream #0.0 : stereo
    Input #0, alsa, from 'hw:0':
     Duration: N/A, start: 1512038836.756092, bitrate: 1411 kb/s
       Stream #0:0: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s
    Input #1, x11grab, from ':0.0':
     Duration: N/A, start: 1512038836.789836, bitrate: N/A
       Stream #1:0: Video: rawvideo (BGR[0] / 0x524742), bgr0, 800x480, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
    Stream mapping:
     Stream #1:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
     Stream #0:0 -> #0:1 (pcm_s16le (native) -> mp3 (libmp3lame))
    Press [q] to stop, [?] for help
    [libx264 @ 0x5621206c0180] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 AVX2 LZCNT BMI2
    [libx264 @ 0x5621206c0180] profile High 4:2:2, level 3.1, 4:2:2 8-bit
    [libx264 @ 0x5621206c0180] 264 - core 148 r2643 5c65704 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    [tee @ 0x5621206bef60] Slave '[f=ffm]http://localhost:8000/feed.ffm/': error writing header: Broken pipe
    [tee @ 0x5621206bef60] Slave muxer #1 failed, aborting.
    Could not write header for output file #0 (incorrect codec parameters ?): Broken pipe
    Error initializing output stream 0:1 --
    Conversion failed!

    I also tried with f=rtsp, with no luck, and got this error message:

    [tee @ 0x559b26ea2f40] Slave '[f=rtsp]http://localhost:8000/feed.ffm/': error writing header: Invalid data found when processing input
    [tee @ 0x559b26ea2f40] Slave muxer #1 failed, aborting.
    Could not write header for output file #0 (incorrect codec parameters ?): Invalid data found when processing input
    Error initializing output stream 0:1 --

    FFserver configuration:

    <feed>
      File /tmp/feed.ffm
      FileMaxSize 20M
    </feed>

    <stream>
      Feed feed.ffm
      Format rtp
      VideoSize 800x480
      VideoQMin 1
      VideoQMax 25
      VideoFrameRate 30
      VideoBitRate 8000
      VideoCodec libx264
      PixelFormat yuv422p


      #h264 options
      AVOptionVideo flags +global_header
      AVOptionVideo preset ultrafast
      AVOptionVideo crf 25
      AVOptionVideo profile high422
      AVOptionVideo tune zerolatency

      NoAudio
    </stream>

    "Slave ’[f=ffm]http://localhost:8000/feed.ffm/’ : error writing header : Broken pipe" is the key, but I can’t figure what is wrong in my command. Without tee, it works flawlessly.

    I did many searches and cannot find the problem yet, so I’m asking here.

    Thanks,
    Barzou.

  • Unable to get current time in Tcl. Can I use flush here?

    30 November 2017, by M. D. P

    I am unable to get the current time with the code below:

    proc a {} {
        for {set i 0} {$i < 3} {incr i} {
            puts " $i "

            set imagetime [clock format [clock seconds] -format %Y%m%d_%H%M%S]
            set videotime [clock format [clock seconds] -format %Y%m%d_%H%M%S]

            exec ffmpeg -f dshow -i "video=Integrated Webcam" -s 1280x720 -benchmark c:/test/Image_$imagetime.jpg >& c:/test/image_$imagetime.txt &

            after 15000

            exec ffmpeg -f dshow -t 00:00:10 -i "video=Integrated Webcam" -s 1280x720 -benchmark c:/test/video_$videotime.avi >& c:/test/video_$videotime.txt &

            after 15000
        }
    }
    a

    The output is:

    (screenshot of the console output)

    The problem is that even though the variables for the video time and the image time are different, the same time is used for both the video and the image.

    Is there any reason for this, or a solution?

    Can I use the flush command?

    Or can I work with the following code instead:

    proc a {} {
        for {set i 0} {$i < 3} {incr i} {
            puts " $i "

            set time [clock format [clock seconds] -format %Y%m%d_%H%M%S]

            exec ffmpeg -f dshow -i "video=Integrated Webcam" -s 1280x720 -benchmark c:/test/Image_$time.jpg >& c:/test/image_$time.txt &

            after 15000

            exec ffmpeg -f dshow -t 00:00:10 -i "video=Integrated Webcam" -s 1280x720 -benchmark c:/test/video_$time.avi >& c:/test/video_$time.txt &

            after 15000
        }
    }
    a

    Any answer?
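
    Regarding the first script above: both timestamps are taken back to back at the top of the loop body, before either capture starts, so imagetime and videotime are necessarily identical; flush only flushes Tcl output channels and would not change that. A minimal sketch of one possible adjustment (untested, reusing the paths and device name from the question): take each timestamp right before the corresponding exec.

    proc a {} {
        for {set i 0} {$i < 3} {incr i} {
            puts " $i "

            # Timestamp taken immediately before the image capture.
            set imagetime [clock format [clock seconds] -format %Y%m%d_%H%M%S]
            exec ffmpeg -f dshow -i "video=Integrated Webcam" -s 1280x720 -benchmark c:/test/Image_$imagetime.jpg >& c:/test/image_$imagetime.txt &

            after 15000

            # Timestamp taken again here, after the 15 s wait, so it differs.
            set videotime [clock format [clock seconds] -format %Y%m%d_%H%M%S]
            exec ffmpeg -f dshow -t 00:00:10 -i "video=Integrated Webcam" -s 1280x720 -benchmark c:/test/video_$videotime.avi >& c:/test/video_$videotime.txt &

            after 15000
        }
    }
    a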