Advanced search

Media (1)

Keyword: - Tags -/book

Other articles (66)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your Médiaspip installation is at version 0.2 or higher. If in doubt, contact the administrator of your MédiaSpip to find out.

  • Videos

    21 April 2011

    Like "audio" documents, Mediaspip displays videos, whenever possible, using the HTML5 <video> tag.
    One drawback of this tag is that it is not correctly recognized by some browsers (Internet Explorer, to name one), and that each browser natively supports only certain video formats.
    Its main advantage is that video playback is handled natively by the browser, which removes the need for Flash and (...)

  • Submitting improvements and additional plugins

    10 April 2011

    If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know and its integration into the official distribution will be considered.
    You can use the development mailing list to announce it or to ask for help with building the plugin. Since MediaSPIP is based on SPIP, it is also possible to use SPIP's SPIP-zone mailing list to (...)

On other sites (9393)

  • using Qt's QProcess as popen (with ffmpeg rawvideo)

    9 October 2019, by Alex

    I inserted some code into a video application to export video using
    ffmpeg fed through stdin (rawvideo rgba format). To quickly test that
    it worked I used popen(); the tests went well, and since the
    application is written with Qt I thought of reworking the patch to use
    QProcess and ->write().

    The application shows no errors and appears to work properly, but the
    generated video files are not playable with either VLC or MPlayer,
    while those generated with popen() play fine in both. I have the
    feeling that ->close() or ->terminate() does not properly close
    ffmpeg, and consequently the file, but I don't know how to verify
    this, nor have I found another way to end the executed command.
    Besides, ->waitForBytesWritten() should wait for the data to be
    written. Any suggestions? Am I doing something wrong?

    (Obviously I can't prepare a testable example; it would take me more
    time than the patch did.)

    Below is the code I added; the #else branch is the Qt code.

    Initialization

    #if defined(EXPORT_POPEN) && EXPORT_POPEN == 1
         pipe_frame.file = popen("/tmp/ffmpeg-rawpipe.sh", "w");
         if (pipe_frame.file == NULL) {
           return false;
         }
    #else
         pipe_frame.qproc = new QProcess;
         pipe_frame.qproc->start("/tmp/ffmpeg-rawpipe.sh", QIODevice::WriteOnly);
         if(!pipe_frame.qproc->waitForStarted()) {
           return false;
         }
    #endif
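
    One debugging aid worth noting (not part of the original patch): by default QProcess keeps the child's stderr on a separate channel, so ffmpeg's own error messages are never seen anywhere. Forwarding the channels before start() makes them appear on the parent's console; a sketch using the same pipe_frame struct:

         pipe_frame.qproc = new QProcess;
         // Forward ffmpeg's stdout/stderr to the parent process so encoder errors become visible
         pipe_frame.qproc->setProcessChannelMode(QProcess::ForwardedChannels);
         pipe_frame.qproc->start("/tmp/ffmpeg-rawpipe.sh", QIODevice::WriteOnly);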

    Writing a frame

    #if defined(EXPORT_POPEN) && EXPORT_POPEN == 1
             fwrite(pipe_frame.data, pipe_frame.width*4*pipe_frame.height , 1, pipe_frame.file);
    #else
             qint64 towrite = pipe_frame.width*4*pipe_frame.height,
               written = 0, partial;
             while(written < towrite) {
               partial = pipe_frame.qproc->write(&pipe_frame.data[written], towrite-written);
               pipe_frame.qproc->waitForBytesWritten(-1);
               written += partial;
             }
    #endif
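
    As an aside, QProcess::write() (like any QIODevice) returns -1 on error, which would make the loop above go backwards and never terminate; a slightly more defensive version of the same loop (a sketch, same pipe_frame fields) checks for that:

             qint64 towrite = pipe_frame.width*4*pipe_frame.height,
               written = 0, partial;
             while (written < towrite) {
               partial = pipe_frame.qproc->write(&pipe_frame.data[written], towrite - written);
               if (partial < 0) {
                 return false;               // write error: bail out instead of looping forever
               }
               pipe_frame.qproc->waitForBytesWritten(-1);
               written += partial;
             }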

    Termination

    #if defined(EXPORT_POPEN) && EXPORT_POPEN == 1
         pclose(pipe_frame.file);
    #else
         pipe_frame.qproc->terminate();
         //pipe_frame.qproc->close();
    #endif

    Edit:

    ffmpeg-rawpipe.sh

    #!/bin/sh
    exec ffmpeg-cuda -y -f rawvideo -s 1920x1080 -pix_fmt rgba -r 25 -i - -an -c:v h264_nvenc \
           -cq:v 19 \
           -profile:v high /tmp/test.mp4

    I made some changes: I added the Unbuffered flag to the open call

    pipe_frame.qproc->start("/tmp/ffmpeg-rawpipe.sh", QIODevice::WriteOnly|QIODevice::Unbuffered);

    and accordingly simplified the write

    qint64 towrite = pipe_frame.width*4*pipe_frame.height;
    pipe_frame.qproc->write(pipe_frame.data, towrite);
    pipe_frame.qproc->waitForBytesWritten(-1);

    I added a closeWriteChannel() before closing the application, hoping that closing ffmpeg's stdin pipe makes it finish properly (just in case; I'm not sure it doesn't)

    pipe_frame.qproc->waitForBytesWritten(-1);
    pipe_frame.qproc->closeWriteChannel();
    //pipe_frame.qproc->terminate();
    pipe_frame.qproc->close();

    But nothing changes: the mp4 file is created and contains data, but from the mplayer log I can see it is misinterpreted; the video format is not recognized and it looks for an audio stream that is not there.
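
    For what it's worth, an mp4 that contains data but whose format is "not recognized" typically suggests ffmpeg never got to write the trailing moov index, i.e. it was killed before it could exit on its own (QProcess::close() kills the process, and destroying a QProcess that is still running does too). A termination sequence along these lines, a sketch under the same assumptions as above, gives ffmpeg the chance to finalize the file:

         pipe_frame.qproc->closeWriteChannel();   // EOF on ffmpeg's stdin: it stops reading and writes the trailer
         pipe_frame.qproc->waitForFinished(-1);   // let ffmpeg exit on its own before the QProcess goes away
         delete pipe_frame.qproc;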

  • avformat/matroskaenc: Remove unnecessary avio_tell(), avio_seek()

    22 January 2020, by Andreas Rheinhardt
    avformat/matroskaenc: Remove unnecessary avio_tell(), avio_seek()
    

    avio_close_dyn_buf() has a bug: when the write pointer does not point to
    the end of the written data when calling it (i.e. when one has performed
    a seek back to update already written data), it would not add padding to
    the end of the buffer, but to the current position, overwriting other
    data; furthermore the reported size would be wrong (off by the amount of
    data it has overwritten with padding).

    In order not to run into this when updating already written elements or
    elements for which size has only been reserved, the Matroska muxer would
    first record the current position of the dynamic buffer, then seek to
    the desired position, perform the update and seek back to the earlier
    position.

    But now that end_ebml_master_crc32() does not make use of
    avio_close_dyn_buf() any more, this is no longer necessary.

    Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@gmail.com>

    • [DH] libavformat/matroskaenc.c
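
    For illustration, the seek-back workaround that this commit removes follows a pattern roughly like the sketch below (simplified, not the actual matroskaenc.c code; dyn_bc and element_pos are placeholder names for the dynamic-buffer AVIOContext and the position of the element being updated):

     int64_t end = avio_tell(dyn_bc);           /* remember the current end of the dynamic buffer          */
     avio_seek(dyn_bc, element_pos, SEEK_SET);  /* jump back to the element whose reserved size is updated  */
     /* ... rewrite the reserved bytes here ... */
     avio_seek(dyn_bc, end, SEEK_SET);          /* return to the end so closing the buffer pads correctly   */
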
  • What are the steps to apply a filter to a video using an ffmpeg C program (Android NDK)?

    13 July 2016, by Rajkumar

    I am a newbie to ffmpeg on Android. I have an input video birds.mp4 with the following settings:

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'birds.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf54.59.106
     Duration: 00:00:21.60, start: 0.000000, bitrate: 574 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 320x176 [SAR 44:45 DAR 16:9], 440 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 127 kb/s (default)
       Metadata:
         handler_name    : SoundHandler

    I have compiled ffmpeg as *.so files for Android and written a JNI wrapper. Now I want to apply a filter to that input video and save it as output.mp4 with the same settings as shown. I know the basic terminology: encoding, decoding, muxing, transcoding.
    I want to know how to do this task, and what techniques are involved in doing it.

    I use the ffmpeg-3.1 C source code grabbed from GitHub. I referred to this sample, but it doesn't guide me.

    If anybody has experienced this problem, please point me in the right direction.
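
    For reference, the usual shape of such a program is demux → decode → libavfilter graph → encode → mux; ffmpeg's own doc/examples/filtering_video.c walks through the full pipeline. Below is a rough sketch of just the filter-graph part, with the buffer source parameters taken from the stream info above; decoded_frame and filtered_frame are placeholders for frames coming from the decoder and going to the encoder:

     #include <stdio.h>
     #include <libavfilter/avfilter.h>
     #include <libavfilter/buffersrc.h>
     #include <libavfilter/buffersink.h>
     #include <libavutil/frame.h>
     #include <libavutil/mem.h>

     /* --- one-time setup, inside your init code --- */
     avfilter_register_all();                       /* still required on ffmpeg 3.1 */

     AVFilterGraph   *graph = avfilter_graph_alloc();
     AVFilterContext *src_ctx, *sink_ctx;
     char args[256];

     /* "buffer" feeds decoded frames in; its parameters must match the decoder output */
     snprintf(args, sizeof(args),
              "video_size=320x176:pix_fmt=%d:time_base=1/12800:pixel_aspect=44/45",
              AV_PIX_FMT_YUV420P);
     avfilter_graph_create_filter(&src_ctx,  avfilter_get_by_name("buffer"),     "in",  args, NULL, graph);
     avfilter_graph_create_filter(&sink_ctx, avfilter_get_by_name("buffersink"), "out", NULL, NULL, graph);

     /* wire a textual filter description (here a simple horizontal flip) between source and sink */
     AVFilterInOut *outputs = avfilter_inout_alloc();
     AVFilterInOut *inputs  = avfilter_inout_alloc();
     outputs->name = av_strdup("in");  outputs->filter_ctx = src_ctx;  outputs->pad_idx = 0; outputs->next = NULL;
     inputs->name  = av_strdup("out"); inputs->filter_ctx  = sink_ctx; inputs->pad_idx  = 0; inputs->next  = NULL;
     avfilter_graph_parse_ptr(graph, "hflip", &inputs, &outputs, NULL);
     avfilter_graph_config(graph, NULL);

     /* --- per decoded frame: push it into the graph, pull filtered frames out, hand them to the encoder --- */
     av_buffersrc_add_frame(src_ctx, decoded_frame);
     while (av_buffersink_get_frame(sink_ctx, filtered_frame) >= 0) {
         /* encode and mux filtered_frame here, then release it */
         av_frame_unref(filtered_frame);
     }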