Advanced search

Media (1)

Word: - Tags -/musée

Other articles (112)

  • Customize by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present the changes to your MédiaSPIP, or the news about your projects on your MédiaSPIP, using the news section.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the news item creation form.
    News item creation form: for a document of the "news item" type, the fields offered by default are: publication date (customize the publication date) (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out.

On other sites (7866)

  • Video concatenation puts sound out of sync

    9 August 2019, by mmorin

    (Cross-posted from Video Production, where the question received no answers and may be more technical than is usual for video production.)

    I have several MOV files from a DSLR camera. I concatenate them with directions from this thread:

    ffmpeg -safe 0 -f concat -i files_to_combine -vcodec copy -acodec copy temp.MOV

    where files_to_combine is:

    file ./DSC_0013.MOV
    ...
    file ./DSC_0019.MOV

    The result has image and sound in sync for the first clip, out of sync by fractions of a second in the second clip, and out of sync by around a second in the last clip. It is probably related to this error from the log:

    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f82dd802200] st: 0 edit list: 1 Missing key frame while searching for timestamp: 1000
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f82dd802200] st: 0 edit list 1 Cannot find an index entry before timestamp: 1000.
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f82dd802200] Auto-inserting h264_mp4toannexb bitstream filter

    How can I trim the frames to the available sound stream, then concatenate the two videos?
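
    (Not part of the original question: one hedged direction, assuming the clips themselves are healthy, is to re-encode during concatenation instead of stream-copying, so ffmpeg rebuilds the timestamps and the aresample filter can pad or trim the audio to stay aligned. The file names and codecs below reuse those from the question; this is a sketch, not a verified fix.)

    # Sketch only: re-encode so timestamps are regenerated; aresample=async=1
    # stretches/pads audio to follow the rebuilt video timeline.
    ffmpeg -fflags +genpts -safe 0 -f concat -i files_to_combine \
           -c:v libx264 -crf 18 -c:a pcm_s16le \
           -af aresample=async=1:first_pts=0 \
           temp.MOV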

    The full log from the ffmpeg command is:

    ffmpeg version 4.1.3 Copyright (c) 2000-2019 the FFmpeg developers
     built with Apple LLVM version 10.0.1 (clang-1001.0.46.4)
     configuration: --prefix=/usr/local/Cellar/ffmpeg/4.1.3_1 --enable-shared --enable-pthreads --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags='-I/Library/Java/JavaVirtualMachines/adoptopenjdk-11.0.2.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/adoptopenjdk-11.0.2.jdk/Contents/Home/include/darwin' --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libmp3lame --enable-libopus --enable-librubberband --enable-libsnappy --enable-libtesseract --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-videotoolbox --disable-libjack --disable-indev=jack --enable-libaom --enable-libsoxr
     libavutil      56. 22.100 / 56. 22.100
     libavcodec     58. 35.100 / 58. 35.100
     libavformat    58. 20.100 / 58. 20.100
     libavdevice    58.  5.100 / 58.  5.100
     libavfilter     7. 40.101 /  7. 40.101
     libavresample   4.  0.  0 /  4.  0.  0
     libswscale      5.  3.100 /  5.  3.100
     libswresample   3.  3.100 /  3.  3.100
     libpostproc    55.  3.100 / 55.  3.100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f82dc00e000] Auto-inserting h264_mp4toannexb bitstream filter
    Input #0, concat, from 'files_to_combine':
     Duration: N/A, start: -0.592000, bitrate: 36888 kb/s
       Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/bt709/bt470m), 1920x1080, 35352 kb/s, 50 fps, 50 tbr, 50k tbn, 100 tbc
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: pcm_s16le (sowt / 0x74776F73), 48000 Hz, stereo, s16, 1536 kb/s
       Metadata:
         handler_name    : SoundHandler
    Output #0, mov, to 'temp.MOV':
     Metadata:
       encoder         : Lavf58.20.100
       Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/bt709/bt470m), 1920x1080, q=2-31, 35352 kb/s, 50 fps, 50 tbr, 50k tbn, 50k tbc
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: pcm_s16le (sowt / 0x74776F73), 48000 Hz, stereo, s16, 1536 kb/s
       Metadata:
         handler_name    : SoundHandler
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
     Stream #0:1 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f82dd802200] st: 0 edit list: 1 Missing key frame while searching for timestamp: 1000
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f82dd802200] st: 0 edit list 1 Cannot find an index entry before timestamp: 1000.
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f82dd802200] Auto-inserting h264_mp4toannexb bitstream filter
    frame=41886 fps=547 q=-1.0 Lsize= 3789826kB time=00:13:58.75 bitrate=37014.8kbits/s speed=10.9x    
    video:3631879kB audio:157123kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.021759%

    Update (1 July 2019)

    I thought that the files had a problem at the beginning or at the end, so I trimmed one second from each end, but the sound was still out of sync:

    FILES=files_to_combine
    OUTPUT=show2.MOV
    rm $FILES
    for i in 3 4 5 6 7 8 9; do
       rm ${i}.MOV
       duration=$(ffprobe -v 0 -show_entries format=duration -of compact=p=0:nk=1  DSC_001${i}.MOV)
       trimmed=$(echo $duration - 1 | bc)
       ffmpeg -ss 1 -t $trimmed -i DSC_001${i}.MOV -vcodec copy -acodec copy ${i}.MOV
       echo file ./${i}.MOV >> $FILES
    done

    rm $OUTPUT
    ffmpeg -safe 0 -f concat -i $FILES -vcodec copy -acodec copy $OUTPUT

    When I trim a single file near the end, the sound and video do not seem out of sync:

    ffmpeg -ss 00:09:20 -t 20 -i DSC_0014.MOV -vcodec copy -acodec copy end.MOV

    When I concatenate only 30 seconds from each video, the result seems OK:

    FILES=files_to_combine
    OUTPUT=show2.MOV
    rm $FILES
    for i in 3 4 5 6 7 8 9; do
       rm ${i}.MOV
       duration=$(ffprobe -v 0 -show_entries format=duration -of compact=p=0:nk=1  DSC_001${i}.MOV)
       start=$(echo $duration - 30 | bc)
       end=$(echo $duration - 1 | bc)
       ffmpeg -ss $start -t $end -i DSC_001${i}.MOV -vcodec copy -acodec copy ${i}.MOV
       echo file ./${i}.MOV >> $FILES
    done

    rm $OUTPUT
    ffmpeg -safe 0 -f concat -i $FILES -vcodec copy -acodec copy $OUTPUT

    This last concatenation gives this error multiple times:

    [mov @ 0x7fc3c7837400] Non-monotonous DTS in output stream 0:0; previous: 9080205, current: 9080200; changing to 9080206. This may result in incorrect timestamps in the output file.

    So I am guessing that the problem is small differences in timestamps that
    accumulate and become more noticeable with longer durations and the
    concatenation of multiple files.
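
    (An aside, not from the post: if the drift really comes from copying packets with slightly inconsistent timestamps, the concat filter, which decodes and re-times both streams before re-encoding, would sidestep the non-monotonous DTS warning. The file names 3.MOV and 4.MOV refer to the intermediate clips produced by the script above; the codec choices are assumptions.)

    # Sketch: concatenate two clips with the concat filter (re-encodes both streams,
    # so timestamps are regenerated rather than copied).
    ffmpeg -i 3.MOV -i 4.MOV \
           -filter_complex "[0:v][0:a][1:v][1:a]concat=n=2:v=1:a=1[v][a]" \
           -map "[v]" -map "[a]" -c:v libx264 -crf 18 -c:a pcm_s16le out.MOV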

    For reference, the DSLR that shot these clips is a Nikon D3300, and the result of ffprobe on one of the files is:

    $ ffprobe DSC_0017.MOV -hide_banner
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fab70003800] st: 0 edit list: 1 Missing key frame while searching for timestamp: 1000
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fab70003800] st: 0 edit list 1 Cannot find an index entry before timestamp: 1000.
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'DSC_0017.MOV':
     Metadata:
       major_brand     : qt  
       minor_version   : 537331968
       compatible_brands: qt  niko
       creation_time   : 2019-06-12T23:52:37.000000Z
     Duration: 00:09:53.58, start: 0.000000, bitrate: 36843 kb/s
       Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/bt709/bt470m), 1920x1080, 35300 kb/s, 50 fps, 50 tbr, 50k tbn, 100 tbc (default)
       Metadata:
         creation_time   : 2019-06-12T23:52:37.000000Z
       Stream #0:1(eng): Audio: pcm_s16le (sowt / 0x74776F73), 48000 Hz, 2 channels, s16, 1536 kb/s (default)
       Metadata:
         creation_time   : 2019-06-12T23:52:37.000000Z

    Update (9 August 2019)

    I concatenated the files in iMovie and the sound and image are not as out of sync as with FFMPEG. Maybe iMovie aligns the timestamps at the end of each clip instead of concatenating the audio and image streams separately.

    I ran the concatenation again with the latest ffmpeg 4.1.4_1 on these files and others from the same camera. The audio and image are in sync in one case (the result lasts 46 minutes) and out of sync in another (the result lasts 48 minutes).

  • FFMPEG stops converting

    19 February 2014, by user3328745

    I've got Ubuntu 12.04 LTS running Wowza Media Server, so I use FFmpeg as a transcoder for live streaming and JWplayer on my website. But ffmpeg always stops converting, and I have to enter the command again and again. Here is the command:

    nohup ffmpeg -i rtsp://log:pass@<cameraip>:554/live1.sdp -ar 44100 -ab 128k -f flv -b 5000k -s 480x270 -y rtmp://<serverip>:1935/live/camera.stream &

    And that's what I get:

    ffmpeg version 0.8.10-4:0.8.10-0ubuntu0.12.04.1, Copyright (c) 2000-2013 the Libav developers
     built on Feb  6 2014 20:56:59 with gcc 4.6.3
    *** THIS PROGRAM IS DEPRECATED ***
    This program is only provided for compatibility and will be removed in a future release. Please use avconv instead.
    [rtsp @ 0x25317a0] Estimating duration from bitrate, this may be inaccurate

    Seems stream 0 codec frame rate differs from container frame rate: 150.00 (150/1) -> 1000.00 (1000/1)
    Input #0, rtsp, from 'rtsp://log:pass@<cameraip>:554/live1.sdp':
     Metadata:
       title           : RTSP/RTP stream 1 from DCS-2132L
       comment         : live1.sdp with v2.0
     Duration: N/A, start: 0.000000, bitrate: N/A
       Stream #0.0: Video: h264 (High), yuvj420p, 640x360 [PAR 1:1 DAR 16:9], 75 fps, 1k tbr, 90k tbn, 150 tbc
       Stream #0.1: Audio: pcm_mulaw, 8000 Hz, 1 channels, s16, 64 kb/s
    Incompatible pixel format 'yuvj420p' for codec 'mpeg4', auto-selecting format 'yuv420p'
    [buffer @ 0x2539f80] w:640 h:360 pixfmt:yuvj420p
    [scale @ 0x253a940] w:640 h:360 fmt:yuvj420p -> w:480 h:270 fmt:yuv420p flags:0x4
    Incompatible sample format 's16' for codec 'ac3', auto-selecting format 'flt'
    [ac3 @ 0x2531120] channel_layout not specified
    [ac3 @ 0x2531120] No channel layout specified. The encoder will guess the layout, but it might be incorrect.
    [ac3 @ 0x2531120] invalid bit rate
    Output #0, avi, to 'rtmp://<serverip>:1935/live/camera.stream':
       Stream #0.0: Video: mpeg4, yuv420p, 480x270 [PAR 1:1 DAR 16:9], q=2-31, 1024 kb/s, 90k tbn, 1k tbc
       Stream #0.1: Audio: ac3, 22050 Hz, mono, flt, 1024 kb/s
    Stream mapping:
     Stream #0.0 -> #0.0
     Stream #0.1 -> #0.1
    Error while opening encoder for output stream #0.1 - maybe incorrect parameters such as bit_rate, rate, width or height

    Please, help me correct the errors.
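
    (Not from the original question: one hedged guess, given the "invalid bit rate" message from the AC-3 encoder, is to name an FLV-compatible audio codec and rate explicitly and keep every output option before the RTMP URL. The placeholders are the ones used above; treat this as a sketch, not a confirmed fix, and note the lame encoder may not be present in the stock Ubuntu 12.04 build.)

    # Sketch: force an FLV-compatible audio codec/rate instead of the ac3 default.
    nohup ffmpeg -i rtsp://log:pass@<cameraip>:554/live1.sdp \
          -vcodec libx264 -b 5000k -s 480x270 \
          -acodec libmp3lame -ar 44100 -ac 1 -ab 128k \
          -f flv -y rtmp://<serverip>:1935/live/camera.stream &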

  • HTTP Header for Duration of a MP4 for HTML 5 video

    9 March 2014, by Mustafa

    I am trying to stream MP4 video as it is encoded from a webserver. I believe I used the appropriate flags, but it is not working correctly. When I download the video from my stream and open it with VLC, it properly shows the duration. Since a socket is not seekable, I assume it writes the metadata at the end? My Chrome browser always shows a duration of 8 seconds. The first 8 seconds play at normal speed, but afterwards the pause button turns into a play button and the video plays very fast, probably as fast as it is received. However, the audio plays at normal speed. I tried document.getElementById('myVid').duration = 20000, but it is a read-only field.

    I wonder, is there any way to explicitly state the duration in HTTP headers, or in any other way? I cannot find any documentation about it.

    ffmpeg -i - -vcodec libx264 -acodec libvo_aacenc -ar 44100 -ac 2 -ab 128000 -f mp4 -movflags frag_keyframe+faststart pipe:1 -fflags +genpts -re -profile baseline -level 30 -preset fast
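
    (An aside under assumptions not in the post: a fragmented MP4 produced this way carries no up-front duration, so one workaround is to probe the source file's duration before transcoding and hand it to the page out of band, for example in a custom response header or a JSON field that the page JavaScript reads. The header name below is purely illustrative.)

    # Sketch: read the source duration with ffprobe; "input.mp4" stands in for the
    # file the Express handler resolves. "X-Video-Duration" is a made-up header name.
    duration=$(ffprobe -v 0 -show_entries format=duration -of compact=p=0:nk=1 input.mp4)
    echo "X-Video-Duration: $duration"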

    To close-voters who think it is not programming related: I use it in my own server that I coded, and I need to set the duration programmatically via JavaScript or by setting the HTTP header. I believe it may be related to both ffmpeg and HTTP headers, which is why I posted it here.

    app.get("/video/*", function(req,res){
       res.writeHead(200, {
           &#39;Content-Type&#39;: &#39;video/mp4&#39;,
       });
       var dir = req.url.split("/").splice(2).join("/");
       var buf = new Buffer(dir, &#39;base64&#39;);
       var src = buf.toString();

       var Transcoder = require(&#39;stream-transcoder&#39;);
       var stream = fs.createReadStream(src);
       // I added my own flags to this module, they are at below:
       new Transcoder(stream)
           .videoCodec(&#39;libx264&#39;)
           .audioCodec("libvo_aacenc")
           .sampleRate(44100)
           .channels(2)
           .audioBitrate(128 * 1000)
           .format(&#39;mp4&#39;)
           .on(&#39;finish&#39;, function() {
               console.log("finished");
           })
           .stream().pipe(res);
    });

    Inside the exec function of that stream-transcoder module:

       a.push("-fflags");
       a.push("+genpts");
       a.push("-re");
       a.push("-profile");
       a.push("baseline");
       a.push("-level");
       a.push("30");
       a.push("-preset");
       a.push("fast");
       a.push("-strict");
       a.push("experimental");
       a.push("-frag_duration");
       a.push("" + 2 * (1000 * 1000));
       var child = spawn('ffmpeg', a, {
           cwd: os.tmpdir()
       });