Advanced search

Media (29)

Keyword: - Tags -/Music

Other articles (111)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used instead.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both on conventional computers (...)

  • Support for all media types

    10 April 2011

    Unlike many modern document-sharing applications and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others); audio (MP3, Ogg, Wav and others); video (Avi, MP4, Ogv, mpg, mov, wmv and others); text content, code or other formats (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

  • Automatic MediaSPIP installation script

    25 April 2011, by

    To work around installation difficulties, mainly caused by server-side software dependencies, an "all in one" bash installation script was created to ease this step on a server running a compatible Linux distribution.
    To use it you need SSH access to your server and a "root" account, which makes it possible to install the dependencies. Contact your hosting provider if you do not have these.
    The documentation on using the installation script (...)

On other sites (17092)

  • What is wrong with this ffmpeg command?

    16 July 2013, by kheya

    I am trying to convert a 3gp movie to an H.264 video (mp4).
    This is what I am using:

    ffmpeg -i file8.3gp -vcodec libx264 -preset slow -vf scale="720:trunc(ow/a/2)*2" -threads 0 -acodec libvo_aacenc -b:a 128k "file8.mp4"

    The problem is that the original file is 28 KB, but after conversion I get a 155 KB mp4 file.
    Why is the file so bloated? Is that because of the scaling or some other option?

    Here is what I see on the 3gp input file (note the bit rate is 46 kb/s):

     libavutil      52. 39.100 / 52. 39.100
     libavcodec     55. 18.102 / 55. 18.102
     libavformat    55. 12.102 / 55. 12.102
     libavdevice    55.  3.100 / 55.  3.100
     libavfilter     3. 80.101 /  3. 80.101
     libswscale      2.  3.100 /  2.  3.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  3.100 / 52.  3.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'file8.3gp':
     Metadata:
       major_brand     : 3gp5
       minor_version   : 256
       compatible_brands: 3gp53gp4
       creation_time   : 2005-10-28 17:36:40
     Duration: 00:00:04.93, start: 0.000000, bitrate: 46 kb/s
       Stream #0:0(eng): Audio: amr_nb (samr / 0x726D6173), 8000 Hz, mono, flt, 8 kb/s
       Metadata:
         creation_time   : 2005-10-28 17:36:40
         handler_name    : Apple Sound Media Handler
       Stream #0:1(eng): Video: mpeg4 (Simple Profile) (mp4v / 0x7634706D), yuv420p, 176x144 [SAR 1:1 DAR 11:9], 35 kb/s, 15 fps, 15 tbr, 600 tbn, 1k tbc
       Metadata:
         creation_time   : 2005-10-28 17:36:40
         handler_name    : Apple Video Media Handler

    Here is what I see on the mp4 output file (note the bit rate is 243 kb/s):

     libavutil      52. 39.100 / 52. 39.100
     libavcodec     55. 18.102 / 55. 18.102
     libavformat    55. 12.102 / 55. 12.102
     libavdevice    55.  3.100 / 55.  3.100
     libavfilter     3. 80.101 /  3. 80.101
     libswscale      2.  3.100 /  2.  3.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  3.100 / 52.  3.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'file8.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf55.12.102
     Duration: 00:00:05.20, start: 0.200000, bitrate: 243 kb/s
       Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 720x588 [SAR 539:540 DAR 11:9], 240 kb/s, 15 fps, 15 tbr, 15360 tbn, 30 tbc
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 8000 Hz, mono, fltp, 10 kb/s
       Metadata:
         handler_name    : SoundHandler
     At least one output file must be specified

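
    The size jump is consistent with the two bitrates reported above: ignoring container overhead, file size is roughly bitrate * duration / 8. A quick check of that arithmetic (a sketch for illustration; the numbers are copied from the two logs):

```python
# Back-of-the-envelope check: file size (kB) ~= bitrate (kbit/s) * duration (s) / 8.
# Bitrates and durations are taken from the two ffmpeg logs above.

def approx_size_kb(bitrate_kbps, duration_s):
    """Approximate file size in kB for a given bitrate and duration."""
    return bitrate_kbps * duration_s / 8

print(round(approx_size_kb(46, 4.93)))   # 28:  matches the 28 KB input file
print(round(approx_size_kb(243, 5.20)))  # 158: close to the observed 155 KB output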
  • Generated ffmpeg video from images has the wrong orientation

    14 September 2020, by asored

    I create a video from image files in Flutter (with ffmpeg) with this code:

    -r $settings_fps -i $tempPath/img%04d.jpg -vcodec libx264 -y -an -vf "pad=ceil(iw/2)*2:ceil(ih/2)*2" -pix_fmt yuv420p $tempPath/$videoFileName.mp4

    Then I add audio to the video file:

    -i ${finalSong.path} -i $videoPath -c:v copy -c:a aac -shortest $tempPath/$newVideoFileName.mp4

    Is it possible that ffmpeg uses the aspect ratio of the first image, and that it automatically detects whether it is in landscape or portrait orientation?

    Currently, if I have images taken in portrait orientation, this is the video output:

    [screenshot of the incorrectly oriented video output]

    I know that I can rotate the video. But users can add images in either portrait or landscape orientation, so I don't want to rotate this video manually.

    Do you have another solution for this? Thank you!

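    One thing worth noting about the first command: the pad=ceil(iw/2)*2:ceil(ih/2)*2 filter only rounds each dimension up to the next even number (yuv420p requires even dimensions); it does not rotate or swap anything. A Python sketch of the same arithmetic, for illustration:

```python
import math

def pad_even(width, height):
    """Mimic pad=ceil(iw/2)*2:ceil(ih/2)*2: round each dimension up to an even number."""
    return math.ceil(width / 2) * 2, math.ceil(height / 2) * 2

print(pad_even(1080, 1921))  # (1080, 1922): an odd height is bumped up by one pixel
print(pad_even(720, 1280))   # (720, 1280): even dimensions pass through unchanged
```

    If portrait photos come out as landscape video, a plausible cause (an assumption here; the question does not say how the images were produced) is that the JPEGs store their pixels in landscape plus an EXIF Orientation tag, which ffmpeg does not apply for image-sequence input; rotating the pixels before encoding, or inserting a transpose filter when the tag is present, would address that.
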
  • What's wrong with my use of timestamps/timebases for frame seeking/reading using libav (ffmpeg)?

    17 September 2013, by mtree

    So I want to grab a frame from a video at a specific time using libav, for use as a thumbnail.

    What I'm using is the following code. It compiles and works fine (as far as retrieving a picture at all goes), yet I'm having a hard time getting it to retrieve the right picture.

    I simply can't get my head around the anything-but-clear logic behind libav's apparent use of multiple time bases per video. Specifically, figuring out which functions expect/return which type of time base.

    The docs were of basically no help whatsoever, unfortunately. SO to the rescue?

    #define ABORT(x) do {fprintf(stderr, x); exit(1);} while(0)

    av_register_all();

    AVFormatContext *format_context = ...;
    AVCodec *codec = ...;
    AVStream *stream = ...;
    AVCodecContext *codec_context = ...;
    int stream_index = ...;

    // open codec_context, etc.

    AVRational stream_time_base = stream->time_base;
    AVRational codec_time_base = codec_context->time_base;

    printf("stream_time_base: %d / %d = %.5f\n", stream_time_base.num, stream_time_base.den, av_q2d(stream_time_base));
    printf("codec_time_base: %d / %d = %.5f\n\n", codec_time_base.num, codec_time_base.den, av_q2d(codec_time_base));

    AVFrame *frame = avcodec_alloc_frame();

    printf("duration: %lld @ %d/sec (%.2f sec)\n", format_context->duration, AV_TIME_BASE, (double)format_context->duration / AV_TIME_BASE);
    printf("duration: %lld @ %d/sec (stream time base)\n\n", format_context->duration / AV_TIME_BASE * stream_time_base.den, stream_time_base.den);
    printf("duration: %lld @ %d/sec (codec time base)\n", format_context->duration / AV_TIME_BASE * codec_time_base.den, codec_time_base.den);

    double request_time = 10.0; // 10 seconds. Video's total duration is ~20sec
    int64_t request_timestamp = request_time / av_q2d(stream_time_base);
    printf("requested: %.2f (sec)\t-> %2lld (pts)\n", request_time, request_timestamp);

    av_seek_frame(format_context, stream_index, request_timestamp, 0);

    AVPacket packet;
    int frame_finished;
    do {
       if (av_read_frame(format_context, &packet) < 0) {
           break;
       } else if (packet.stream_index != stream_index) {
           av_free_packet(&packet);
           continue;
       }
       avcodec_decode_video2(codec_context, frame, &frame_finished, &packet);
    } while (!frame_finished);

    // do something with frame

    int64_t received_timestamp = frame->pkt_pts;
    double received_time = received_timestamp * av_q2d(stream_time_base);
    printf("received:  %.2f (sec)\t-> %2lld (pts)\n\n", received_time, received_timestamp);

    Running this with a test movie file, I get this output:

       stream_time_base: 1 / 30000 = 0.00003
       codec_time_base: 50 / 2997 = 0.01668

       duration: 20062041 @ 1000000/sec (20.06 sec)
       duration: 600000 @ 30000/sec (stream time base)
       duration: 59940 @ 2997/sec (codec time base)

       requested: 10.00 (sec)  -> 300000 (pts)
       received:  0.07 (sec)   -> 2002 (pts)
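
    Every number in that output can be reproduced with plain fraction arithmetic on the two time bases; a small Python sketch, using the values printed above:

```python
from fractions import Fraction

stream_tb = Fraction(1, 30000)  # stream_time_base from the output
codec_tb = Fraction(50, 2997)   # codec_time_base from the output

# "requested": seconds divided by the stream time base gives the seek target pts
print(int(Fraction(10) / stream_tb))      # 300000

# "received": pts 2002 read back in each of the two time bases
print(round(float(2002 * stream_tb), 2))  # 0.07
print(round(float(2002 * codec_tb), 2))   # 33.4
```

    Since frame->pkt_pts is a packet-level timestamp, and packet timestamps are in the stream's time base (consistent with the mailing-list quote further down), the 0.07 s reading is the correct one: the decoder really did return a frame from near the start of the file, which suggests the seek itself, rather than the conversion, is what deserves scrutiny.
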

    The times don't match. What's going on here? What am I doing wrong?


    While searching for clues I stumbled upon this statement from the libav-users mailing list…

    [...] packet PTS/DTS are in units of the format context's time_base,
    where the AVFrame->pts value is in units of the codec context's time_base.

    In other words, the container can have (and usually does) a different
    time_base than the codec. Most libav players don't bother using the
    codec's time_base or pts since not all codecs have one, but most
    containers do. (This is why the dranger tutorial says to ignore AVFrame->pts)

    …which confused me even more, given that I couldn't find any such mention in the official docs.

    Anyway, I replaced…

    double received_time = received_timestamp * av_q2d(stream_time_base);

    …with…

    double received_time = received_timestamp * av_q2d(codec_time_base);

    …and the output changed to this…

    ...

    requested: 10.00 (sec)  -> 300000 (pts)
    received:  33.40 (sec)  -> 2002 (pts)

    Still no match. What's wrong?