Advanced search

Media (91)

Other articles (66)

  • Managing creation and editing rights for objects

    8 February 2011

    By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, in particular: writing content on the site, adjustable in the form template management; adding notes to articles; adding captions and annotations to images;

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

On other sites (8208)

  • Error: "libavformat\avformat.h: No such file or directory"

    21 February 2023, by Abdo Daood

    I created a simple C++ file called test.cpp:

    


    #include <iostream>
    #include <libavformat\avformat.h>
    using namespace std;

    int main()
    {
        cout << "Hello, World!";
        return 0;
    }


    and used g++ in the terminal to compile the test.cpp file with the command:


    g++ test.cpp -o test


    I get this error:


    test.cpp:2:10: fatal error: libavformat\avformat.h: No such file or directory
        2 | #include <libavformat\avformat.h>
          |          ^~~~~~~~~~~~~~~~~~~~~~~~
    compilation terminated.


    Notes:
    The libavformat\avformat.h path is: /usr/include/x86_64-linux-gnu/libavformat/avformat.h


    and the ffmpeg version in use is:


    $ ffmpeg -version

    ffmpeg version 4.4.2-0ubuntu0.22.04.1 Copyright (c) 2000-2021 the FFmpeg developers
    built with gcc 11 (Ubuntu 11.2.0-19ubuntu1)
    configuration: --prefix=/usr --extra-version=0ubuntu0.22.04.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-pocketsphinx --enable-librsvg --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
    libavutil      56. 70.100 / 56. 70.100
    libavcodec     58.134.100 / 58.134.100
    libavformat    58. 76.100 / 58. 76.100
    libavdevice    58. 13.100 / 58. 13.100
    libavfilter     7.110.100 /  7.110.100
    libswscale      5.  9.100 /  5.  9.100
    libswresample   3.  9.100 /  3.  9.100
    libpostproc    55.  9.100 / 55.  9.100


    Using this command:


    g++ `pkg-config --cflags libavformat` test.cpp `pkg-config --libs libavformat` -o test
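
    For comparison, a minimal sketch of what a fixed test.cpp and compile step might look like on this setup; it is illustrative only and assumes the header is included with forward slashes (as in the path quoted above) and wrapped in extern "C", since the FFmpeg headers are plain C:

    // test.cpp - illustrative sketch, not the original poster's final code
    #include <iostream>

    extern "C" {                       // FFmpeg headers are C, so prevent C++ name mangling
    #include <libavformat/avformat.h>  // forward slashes; found via pkg-config's include path
    }

    int main()
    {
        // avformat_version() just proves the header and library are usable.
        std::cout << "libavformat version: " << avformat_version() << std::endl;
        return 0;
    }

    compiled with something like:

    g++ test.cpp $(pkg-config --cflags --libs libavformat) -o test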


  • av_read_frame and time stamps C++

    19 April 2014, by halfwaythru

    I am recording an RTSP H264 stream from an Axis IP camera using libavformat. This camera is supposed to stamp every frame with the time that the frame was acquired, and this information is supposed to be in the RTP packet header.

    This is the code that I am using to read in the frames.

    AVFormatContext *inputFormatCtxt = NULL;
    AVPacket packet;
    avformat_open_input(&inputFormatCtxt, inputURL, NULL, NULL);
    avformat_find_stream_info(inputFormatCtxt, NULL);

    while(av_read_frame(inputFormatCtxt, &packet) >= 0)
    {
       if(packet.stream_index == videoStreamIndex)
       {
          // Do something to video packet.
       }
       else
       {
          // Do something to audio packet.
       }

       if (packet.pts != AV_NOPTS_VALUE)
           packet.dts = packet.pts = av_rescale_q(packet.pts, stream->time_base, oStream->time_base);
       else
           NSLog(@"packet.pts == AV_NOPTS_VALUE");

       if(av_interleaved_write_frame(outputFormatCtxt, &packet) < 0)
           NSLog(@"Could not write out frame.");

       av_free_packet(&packet);
    }

    Now in AVPacket, the only time-related information is the pts and the dts. After converting them into seconds, these are supposed to be the offset of the packet (in seconds) from the start of the stream.
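
    As a side note, a minimal sketch of that conversion, assuming packet, inputFormatCtxt and videoStreamIndex from the code above (the stream's time_base is the unit of pts, in seconds per tick):

    // Illustrative only: express this packet's pts in seconds from the start of the stream.
    AVStream *vStream = inputFormatCtxt->streams[videoStreamIndex];
    double ptsSeconds = (packet.pts != AV_NOPTS_VALUE)
                      ? packet.pts * av_q2d(vStream->time_base)  // av_q2d() turns the AVRational time_base into a double
                      : -1.0;                                    // no timestamp on this packet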

    My question is: how do I get the start time of the stream?

    These are the things that I have tried:

    1.) In AVFormatContext there is a field start_time_realtime that is "supposed" to be the start time of the stream in real-world time, in microseconds. This is exactly what I need, but no matter what I do, this value is always 0 and never changes. Am I missing an initialization step, which is why it never gets set? (A sketch of reading this field is included after this question.)

    2.) Looking at this link, I added an RTPDemuxContext object to my code:

    RTSPState* rtsp_state = (RTSPState*) inputFormatCtxt->priv_data;
    RTSPStream* rtsp_stream = rtsp_state->rtsp_streams[0];
    RTPDemuxContext* rtp_demux_context = (RTPDemuxContext*) rtsp_stream->transport_priv;

    When I looked at the last_rtcp_reception_time, last_rtcp_ntp_time and last_rtcp_timestamp fields within the RTPDemuxContext object, these values are also always 0 and don't change.

    3.) Following on from the last point, I tried to force-fetch a packet using ff_rtsp_fetch_packet(inputFormatCtxt, &packet). This did update the RTPDemuxContext timestamps, but only while stepping through the code. If I just ran the code in a loop, it always remained the same as whatever was in the RTPDemuxContext object before the loop.

    int64_t x = 0;
    x = rtp_demux_context->last_rtcp_reception_time;   // x is 0.
    while(ff_rtsp_fetch_packet(inputFormatCtxt, &amp;packet))
    {
       x = rtp_demux_context->last_rtcp_reception_time;   // x changes only when stepping through code. else remains 0
    }

    At this point I have no idea what I am doing wrong. I can't seem to get this timestamp information, no matter what I try. Any help is much appreciated. Thanks!
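
    For what it's worth, a sketch of how start_time_realtime would typically be read once it is populated; this is illustrative only and assumes (as the RTSP demuxer's behaviour suggests) that the field stays at 0 or AV_NOPTS_VALUE until an RTCP sender report has been received:

    // Illustrative sketch: combine the wall-clock start of the stream with a packet's pts.
    // start_time_realtime is documented as microseconds since the Unix epoch for pts == 0.
    AVStream *vStream = inputFormatCtxt->streams[videoStreamIndex];
    if (inputFormatCtxt->start_time_realtime != 0 &&
        inputFormatCtxt->start_time_realtime != AV_NOPTS_VALUE)
    {
        int64_t ptsMicros = av_rescale_q(packet.pts, vStream->time_base, AV_TIME_BASE_Q);
        int64_t wallclockMicros = inputFormatCtxt->start_time_realtime + ptsMicros;
        NSLog(@"packet wall-clock time (us since epoch): %lld", wallclockMicros);
    }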

  • Ffmpeg set duration using node-fluent-ffmpeg

    23 May 2013, by Vprnl

    I'm really new to the world of ffmpeg, so please excuse me if this is a stupid question.

    I'm using the module Node-fluent-ffmpeg to stream a movie and convert it from avi to webm.
    So far so good (it plays the video), but I'm having trouble passing the duration to the player. My ultimate goal is to be able to skip ahead in the movie, but first the player needs to know how long the video is.

    My code is as follows:

    var stat = fs.statSync(movie);

    var start = 0;
    var end = 0;
    var range = req.header('Range');
    if (range != null) {
       start = parseInt(range.slice(range.indexOf('bytes=') + 6,
                                    range.indexOf('-')));
       end = parseInt(range.slice(range.indexOf('-') + 1,
                                  range.length));
    }
    if (isNaN(end) || end == 0) end = stat.size - 1;
    if (start > end) return;

    res.writeHead(206, { // NOTE: a partial http response
       'Connection': 'close',
       'Content-Type': 'video/webm',
       'Content-Length': end - start,
       'Content-Range': 'bytes ' + start + '-' + end + '/' + stat.size,
       'Transfer-Encoding': 'chunked'
    });

    var proc = new ffmpeg({ source: movie, nolog: true, priority: 1, timeout: 15000 })
       .toFormat('webm')
       .withVideoBitrate('1024k')
       .addOptions(['-probesize 900000', '-analyzeduration 0', '-bufsize 14000'])
       .writeToStream(res, function(retcode, error){
          if (!error){
              console.log('file has been converted successfully', retcode);
          } else {
              console.log('file conversion error', error);
          }
       });

    I tried to set the header with a start and an end based on this article: http://delog.wordpress.com/2011/04/25/stream-webm-file-to-chrome-using-node-js/

    I also looked in the FFmpeg documentation and found -f duration and -ss.
    But I don't quite know how to convert the byte range to seconds.
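
    As an aside, not from the original post: the usual approach is to read the duration once with ffprobe and to seek by time rather than by byte offset. The sketch below assumes the ffprobe() and seekInput() helpers of fluent-ffmpeg are available (they exist in current releases; whether the 2013 version used here has them is an assumption) and that the player sends a time offset such as ?start=120:

    // Illustrative sketch: expose the duration and honour a time-based seek.
    var ffmpeg = require('fluent-ffmpeg');

    ffmpeg.ffprobe(movie, function(err, metadata) {
       if (err) { res.end(); return; }

       var duration = metadata.format.duration;       // total length in seconds
       var startSec = parseFloat(req.query.start) || 0;

       res.writeHead(200, {
          'Content-Type': 'video/webm',
          'X-Content-Duration': duration              // hint some players use for WebM duration
       });

       ffmpeg(movie)
          .seekInput(startSec)                        // like ffmpeg's -ss placed before the input
          .toFormat('webm')
          .withVideoBitrate('1024k')
          .writeToStream(res);
    });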

    I feel like I'm pretty close to a solution, but my inexperience with the subject matter is keeping me from getting it to work. If I'm unclear in any way, please let me know. (I have a tendency to explain things fuzzily.)

    Thanks in advance!