Other articles (32)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" section of the site.
    From there, the navigation menu gives access to a "Gestion des langues" section where support for new languages can be enabled.
    Each newly added language can still be disabled as long as no object has been created in that language. In that case it becomes greyed out in the configuration and (...)

  • Apache-specific configuration

    4 February 2011

    Specific modules
    For the Apache configuration, it is advisable to enable certain modules that are not specific to MediaSPIP but that improve performance: mod_deflate and mod_headers, so that Apache compresses pages automatically (see this tutorial); mod_expires, to handle the expiry of hits correctly (see this tutorial).
    It is also advisable to add Apache support for the mime-type of WebM files, as described in this tutorial.
    Creation of a (...)
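
    As a rough illustration of the advice above (the excerpt itself gives no commands, so the Debian-style a2enmod helper and the reload command here are assumptions), enabling those modules and declaring the WebM mime-type could look something like this:

    a2enmod deflate headers expires      # enable the compression and expiry modules named above
    # in any Apache configuration file (or .htaccess) that the server loads, declare the mime-type:
    #   AddType video/webm .webm
    apachectl graceful                   # reload the configuration without dropping connections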

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

On other sites (4745)

  • After merging videos, the duration is too long - ffmpeg

    20 February 2017, by Thanh Dao

    I have a txt file with this content:

    file intro.mp4
    file video.mp4
    file outtro.mp4

    with durations of 10s, 178s and 13s respectively.

    I use ffmpeg to merge the 3 files into one with the command below:

    ffmpeg -f concat -i "file.txt" -vcodec copy -acodec copy "endfile.mp4"

    The duration of endfile.mp4 ends up far too long: 11 minutes (660s).

    My question is: which video parameters affect the merge? And which parameters do the videos need to have in common to be merged properly?

    My English is really bad, sorry for that :)
    Have a good working week!

    P.S. Detailed info for the files:

    intro.mp4:

    ffprobe version N-82885-g6d09d6e Copyright (c) 2007-2016 the FFmpeg developers
      built with gcc 4.4.7 (GCC) 20120313 (Red Hat 4.4.7-17)
      configuration: --prefix=/root/ffmpeg_build --extra-cflags=-I/root/ffmpeg_build/include --extra-ldflags='-L/root/ffmpeg_build/lib -ldl' --bindir=/root/bin --pkg-config-flags=--static --enable-gpl --enable-nonfree --enable-libfdk_aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265
      libavutil      55. 43.100 / 55. 43.100
      libavcodec     57. 68.100 / 57. 68.100
      libavformat    57. 61.100 / 57. 61.100
      libavdevice    57.  2.100 / 57.  2.100
      libavfilter     6. 68.100 /  6. 68.100
      libswscale      4.  3.101 /  4.  3.101
      libswresample   2.  4.100 /  2.  4.100
      libpostproc    54.  2.100 / 54.  2.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/path/to/intro.mp4':
      Metadata:
        major_brand     : isom
        minor_version   : 512
        compatible_brands: isomiso2avc1mp41
        encoder         : Lavf56.23.100
      Duration: 00:00:10.08, start: -0.013061, bitrate: 701 kb/s
        Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
        Metadata:
          handler_name    : SoundHandler
        Stream #0:1(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1920x1080, 853 kb/s, 24 fps, 24 tbr, 12288 tbn, 48 tbc (default)
        Metadata:
          handler_name    : VideoHandler

    outtro.mp4:

    ffprobe version N-82885-g6d09d6e Copyright (c) 2007-2016 the FFmpeg developers
      built with gcc 4.4.7 (GCC) 20120313 (Red Hat 4.4.7-17)
      configuration: --prefix=/root/ffmpeg_build --extra-cflags=-I/root/ffmpeg_build/include --extra-ldflags='-L/root/ffmpeg_build/lib -ldl' --bindir=/root/bin --pkg-config-flags=--static --enable-gpl --enable-nonfree --enable-libfdk_aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265
      libavutil      55. 43.100 / 55. 43.100
      libavcodec     57. 68.100 / 57. 68.100
      libavformat    57. 61.100 / 57. 61.100
      libavdevice    57.  2.100 / 57.  2.100
      libavfilter     6. 68.100 /  6. 68.100
      libswscale      4.  3.101 /  4.  3.101
      libswresample   2.  4.100 /  2.  4.100
      libpostproc    54.  2.100 / 54.  2.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/path/to/outtro.mp4':
      Metadata:
        major_brand     : isom
        minor_version   : 512
        compatible_brands: isomiso2avc1mp41
        encoder         : Lavf56.23.100
      Duration: 00:00:13.08, start: -0.013061, bitrate: 481 kb/s
        Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
        Metadata:
          handler_name    : SoundHandler
        Stream #0:1(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1920x1080, 392 kb/s, 24 fps, 24 tbr, 12288 tbn, 48 tbc (default)
        Metadata:
          handler_name    : VideoHandler

    video.mp4:

    ffprobe version N-82885-g6d09d6e Copyright (c) 2007-2016 the FFmpeg developers
      built with gcc 4.4.7 (GCC) 20120313 (Red Hat 4.4.7-17)
      configuration: --prefix=/root/ffmpeg_build --extra-cflags=-I/root/ffmpeg_build/include --extra-ldflags='-L/root/ffmpeg_build/lib -ldl' --bindir=/root/bin --pkg-config-flags=--static --enable-gpl --enable-nonfree --enable-libfdk_aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265
      libavutil      55. 43.100 / 55. 43.100
      libavcodec     57. 68.100 / 57. 68.100
      libavformat    57. 61.100 / 57. 61.100
      libavdevice    57.  2.100 / 57.  2.100
      libavfilter     6. 68.100 /  6. 68.100
      libswscale      4.  3.101 /  4.  3.101
      libswresample   2.  4.100 /  2.  4.100
      libpostproc    54.  2.100 / 54.  2.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'path/to/video.mp4':
      Metadata:
        major_brand     : isom
        minor_version   : 512
        compatible_brands: isomiso2avc1mp41
        encoder         : Lavf57.61.100
      Duration: 00:02:58.38, start: 0.000000, bitrate: 922 kb/s
        Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], 782 kb/s, 29.97 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
        Metadata:
          handler_name    : VideoHandler
        Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 130 kb/s (default)
        Metadata:
          handler_name    : SoundHandler
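
    Not part of the original question, but one observation from the ffprobe output above: the concat demuxer with -c copy expects all inputs to share the same codec parameters and timebase, yet video.mp4 is 1280x720 at 29.97 fps with a 30k tbn while intro.mp4 and outtro.mp4 are 1920x1080 at 24 fps with a 12288 tbn, and mismatched timebases like this are a classic cause of wildly wrong concatenated durations. A minimal sketch of one common workaround is to re-encode the middle clip so it matches the other two (video_fixed.mp4 is a made-up intermediate name):

    # re-encode video.mp4 to match intro/outtro: 1920x1080, 24 fps, AAC at 44.1 kHz
    ffmpeg -i video.mp4 -vf scale=1920:1080,fps=24 -c:v libx264 -c:a aac -ar 44100 video_fixed.mp4
    # list intro.mp4, video_fixed.mp4 and outtro.mp4 in file.txt, then stream copy is safe again
    ffmpeg -f concat -safe 0 -i file.txt -c copy endfile.mp4

    Re-encoding all three clips in a single ffmpeg call with the concat filter would also work; the two-step version is just easier to check clip by clip.
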
  • MPEG-DASH - Multiplexed Representations Issue

    26 April 2017, by Mike

    I’m trying to learn ffmpeg, MP4Box, and MPEG-DASH, but I’m running into an issue with the .mp4 I’m using. I’m using ffmpeg to demux the mp4 with this command:

    ffmpeg -i test.mp4 -c:v copy -g 72 -an video.mp4 -c:a copy audio.mp4

    Once the two files are created, I use MP4Box to segment the files for the dash player using this command:

    MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ output.mp4

    That does create all the files I think I need. Then I point the player at output_dash.mpd and nothing happens, except for a ton of messages in the console:

    [8] EME detected on this user agent! (ProtectionModel_21Jan2015)
    [11] Playback Initialized
    [21] [dash.js 2.3.0] MediaPlayer has been initialized
    [64] Parsing complete: ( xml2json: 3.42ms, objectiron: 2.61ms, total: 0.00603s)
    [65] Manifest has been refreshed at Wed Apr 12 2017 12:16:52 GMT-0600 (MDT)[1492021012.196]  
    [72] MediaSource attached to element.  Waiting on open...
    [77] MediaSource is open!
    [77] Duration successfully set to: 148.34
    [78] Added 0 inline events
    [78] No video data.
    [79] No audio data.
    [79] No text data.
    [79] No fragmentedText data.
    [79] No embeddedText data.
    [80] Multiplexed representations are intentionally not supported, as they are not compliant with the DASH-AVC/264 guidelines
    [81] No streams to play.

    Here is the MP4Box -info on the video I’m using:

    * Movie Info *
       Timescale 1000 - Duration 00:02:28.336
       Fragmented File no - 2 track(s)
       File suitable for progressive download (moov before mdat)
       File Brand mp42 - version 512
       Created: GMT Wed Feb  6 06:28:16 2036

    File has root IOD (9 bytes)
    Scene PL 0xff - Graphics PL 0xff - OD PL 0xff
    Visual PL: Not part of MPEG-4 Visual profiles (0xfe)
    Audio PL: Not part of MPEG-4 audio profiles (0xfe)
    No streams included in root OD

    iTunes Info:
       Name: Rogue One - A Star Wars Story
       Artist: Lucasfilm
       Genre: Trailer
       Created: 2016
       Encoder Software: HandBrake 0.10.2 2015060900
       Cover Art: JPEG File

    Track # 1 Info - TrackID 1 - TimeScale 90000 - Duration 00:02:28.335
    Media Info: Language "Undetermined" - Type "vide:avc1" - 3552 samples
    Visual Track layout: x=0 y=0 width=1920 height=816
    MPEG-4 Config: Visual Stream - ObjectTypeIndication 0x21
    AVC/H264 Video - Visual Size 1920 x 816
       AVC Info: 1 SPS - 1 PPS - Profile High @ Level 4.1
       NAL Unit length bits: 32
       Pixel Aspect Ratio 1:1 - Indicated track size 1920 x 816
    Self-synchronized

    Track # 2 Info - TrackID 2 - TimeScale 44100 - Duration 00:02:28.305
    Media Info: Language "English" - Type "soun:mp4a" - 6387 samples
    MPEG-4 Config: Audio Stream - ObjectTypeIndication 0x40
    MPEG-4 Audio MPEG-4 Audio AAC LC - 2 Channel(s) - SampleRate 44100
    Synchronized on stream 1
    Alternate Group ID 1

    I know I need to separate the video and audio and I think that’s where my issue is. The command I’m using probably isn’t doing the right thing.

    Is there a better command to demux my mp4?
    Is the MP4Box command I’m using the best way to segment the files?
    If I use different files, will they always need to be demuxed?

    One thing to mention: if I use the following commands, everything works fine, but there is no audio because of the -an flag, which means it’s video only:

    ffmpeg -i test.mp4 -c:v copy -g 72 -an output.mp4

    MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ output.mp4

    UPDATE

    I noticed that the video file had no audio stream, but the audio file still contained the video stream, which is why I got the mux error. I thought that might be the issue, so I ran this command to keep the unwanted streams out of the outputs:

    ffmpeg -i test.mp4 -c:v copy -g 72 -an video.mp4 -c:a copy -vn audio.mp4

    Then I run:

    MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ video.mp4 audio.mp4

    Now I no longer get the Multiplexed representations are intentionally not supported... message, but instead I get:

    [122] Video Element Error: MEDIA_ERR_SRC_NOT_SUPPORTED
    [123] [object MediaError]
    [125] Schedule controller stopping for audio
    [126] Caught pending play exception - continuing (NotSupportedError: Failed to load because no supported source was found.)

    I tried playing the video and audio independently through Chrome and they both work, just not through the dash player. Ugh, this is painful to learn, but I feel like I’m making progress.
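
    A hedged aside rather than a confirmed fix: the updated commands above are essentially the standard split-then-package flow (note that -g has no effect together with -c:v copy, since stream copy never re-encodes), so the remaining MEDIA_ERR_SRC_NOT_SUPPORTED often points at how the files are served rather than at the packaging itself; dash.js needs the manifest and segments to come back with usable MIME types. Under those assumptions, the whole pipeline might look like this, with -out simply giving the manifest an explicit name:

    # split cleanly, as in the update above
    ffmpeg -i test.mp4 -map 0:v -c copy video.mp4
    ffmpeg -i test.mp4 -map 0:a -c copy audio.mp4
    # package both tracks into one manifest
    MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ -out manifest.mpd video.mp4 audio.mp4
    # serving side: .mpd as application/dash+xml, .m4s as video/iso.segment (or video/mp4)

    It is also worth opening manifest.mpd and checking that it contains one video AdaptationSet and one audio AdaptationSet before blaming the player.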

  • Recording a webpage stream with multiple requests using PhantomJS & ffmpeg to /dev/stdout leads to ffmpeg error

    2 September 2016, by Allisson Ferreira

    First of all, sorry for my English.

    I’ve been on a quest for days. I’ve researched everywhere and couldn’t find an answer to my problem.

    I’m using Node.js, PhantomJS and ffmpeg in this scenario:

    • A user enters the site, logs in with Facebook, and can request a video with his name and some random photos (gathered from /me/ and sent via JSON POST);
    • Node receives the user data, creates a child process (PhantomJS + ffmpeg) and waits for a response so it can send the video URL back to the user.

    When I run a single instance of this request, everything works fine. BUT when two or more users make the request, only one video is delivered and the other processes end up with an ffmpeg stream error.

    I think the reason is that all the ffmpeg processes are using the same place (/dev/stdout). Since one process is already using it, the others run into a "can’t access" error. But that is just an assumption; I don’t know how /dev/stdout really works.

    Here is my code. (I have removed some lines and renamed some variables for clarity; sorry for any mistakes.)

    index.js:

    var generateVideo = 'phantomjs phantom.js '+videoID+' '+userID+' | ffmpeg -vcodec png -f image2pipe -r 30 -i - -pix_fmt yuv420p public/videos/'+userID+'/'+videoID+'.mp4 -y';

    childProcess.exec(generateVideo, function(err, stdout, stderr) {
       var json    = {};
       json.video  = '/videos/'+userID+'/'+videoID+'.mp4';
       res.send(json);
    });

    phantom.js:

    var page            = require('webpage').create();
    page.viewportSize   = { width: 1366, height: 768 };
    page.settings.resourceTimeout = 10000;

    var args            = require('system').args;
    var videoID         = args[1];
    var userID          = args[2];

    page.open('http://localhost:3000/recordvideo/'+videoID, 'post', function(status){
       var frame       = 0;
       var target_fps  = 30;
       var maxframes   = page.evaluate(function () {
                           return getTotalDurationInSeconds();
                       }) * target_fps;

       setInterval(function(){
           page.render('/dev/stdout', { format: "png" });
           if( frame >= maxframes ){
               phantom.exit();
           }
           frame++;
       }, (1000 / target_fps));
    });

    And the error:

    [Error: Command failed: /bin/sh -c phantomjs phantom.js XXXXXXXX XXXXXXXX | ffmpeg -vcodec png -f image2pipe -r 30 -i - -pix_fmt yuv420p public/videos/XXXXXXXX/XXXXXXXX.mp4 -y
    www-0 ffmpeg version N-80901-gfebc862 Copyright (c) 2000-2016 the FFmpeg developers
    www-0   built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
    www-0   configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --mandir=/usr/share/man --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libfreetype --enable-gnutls --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvidstab
    www-0   libavutil      55. 28.100 / 55. 28.100
    www-0   libavcodec     57. 48.101 / 57. 48.101
    www-0   libavformat    57. 41.100 / 57. 41.100
    www-0   libavdevice    57.  0.102 / 57.  0.102
    www-0   libavfilter     6. 47.100 /  6. 47.100
    www-0   libavresample   3.  0.  0 /  3.  0.  0
    www-0   libswscale      4.  1.100 /  4.  1.100
    www-0   libswresample   2.  1.100 /  2.  1.100
    www-0   libpostproc    54.  0.100 / 54.  0.100
    www-0 [png @ 0x3d7c4a0] Invalid PNG signature 0x46726F6D20506861.
    www-0 [image2pipe @ 0x3d72780] decoding for stream 0 failed
    www-0 [image2pipe @ 0x3d72780] Could not find codec parameters for stream 0 (Video: png, none(pc)): unspecified size
    www-0 Consider increasing the value for the 'analyzeduration' and 'probesize' options
    www-0 Input #0, image2pipe, from 'pipe:':
    www-0   Duration: N/A, bitrate: N/A
    www-0     Stream #0:0: Video: png, none(pc), 30 tbr, 30 tbn, 30 tbc
    www-0 [buffer @ 0x3d81540] Unable to parse option value "0x0" as image size
    www-0 [buffer @ 0x3d81540] Unable to parse option value "-1" as pixel format
    www-0 [buffer @ 0x3d81540] Unable to parse option value "0x0" as image size
    www-0 [buffer @ 0x3d81540] Error setting option video_size to value 0x0.
    www-0 [graph 0 input from stream 0:0 @ 0x3d72600] Error applying options to the filter.
    www-0 Error opening filters!
    www-0 ]

    I really hope I can find an answer here!
    And sorry if there is already an answer for this somewhere; I searched for days.

    Thank you in advance!
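
    A speculative note on the /dev/stdout guess above: each phantomjs | ffmpeg pair actually gets its own private pipe, but the hex in the error (0x46726F6D20506861) decodes to the ASCII text "From Pha", which suggests that some plain-text output from PhantomJS is landing in the same stdout stream as the PNG frames and corrupting it, something that may only show up under load. One way to keep frames and text strictly apart is to stop rendering to /dev/stdout and write per-job numbered frames instead; the directory layout and the pad() helper below are hypothetical, everything else follows the original commands:

    # in phantom.js, render to a per-job directory instead of /dev/stdout, e.g.
    #   page.render('/tmp/frames_' + userID + '_' + videoID + '/' + pad(frame) + '.png');
    # then, once PhantomJS exits, encode from those files (placeholder paths to match the example)
    ffmpeg -framerate 30 -i /tmp/frames_USERID_VIDEOID/%06d.png -pix_fmt yuv420p \
           public/videos/USERID/VIDEOID.mp4 -y

    This trades the streaming pipe for some disk I/O, but it makes concurrent jobs completely independent and leaves stdout free for logging.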