Media (1)

Keyword: - Tags - /getid3

Other articles (74)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Managing creation and editing rights for objects

    8 February 2011

    By default, many features are limited to administrators, but each can be configured independently to change the minimum status required to use it, notably: writing content on the site, adjustable through the form-template management; adding notes to articles; adding captions and annotations to images;

  • Uploading media and themes via FTP

    31 May 2013

    MediaSPIP also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
    From the start you will find the following folders in your FTP space: config/: the site's configuration folder; IMG/: media already processed and online on the site; local/: the site's cache directory; themes/: custom themes and stylesheets; tmp/: working folder (...)

On other sites (9722)

  • Revision 33026: Add the mots, breves, auteurs ... classes to the extras

    17 November 2009, by shnoulle@… — Log

    Add the mots, breves, auteurs ... classes to the extras

  • How to send large x264 NAL over RTMP?

    17 September 2017, by samgak

    I’m trying to stream video over RTMP using x264 and rtmplib in C++ on Windows.

    So far I have managed to encode and stream a test video pattern consisting of animated multi-colored vertical lines that I generate in code. It’s possible to start and stop the stream, and start and stop the player, and it works every time. However, as soon as I modify it to send encoded camera frames instead of the test pattern, the streaming becomes very unreliable. It only starts <20% of the time, and stopping and restarting doesn’t work.

    After searching around for answers I concluded that it must be because the NAL size is too large (my test pattern is mostly flat color, so it encodes to a very small size), and that there is an Ethernet packet limit of around 1400 bytes that comes into play. So I tried to make x264 output only NALs under 1200 bytes, by setting i_slice_max_size in my x264 setup:

    if (x264_param_default_preset(&param, "veryfast", "zerolatency") < 0)
       return false;
    param.i_csp = X264_CSP_I420;
    param.i_threads = 1;
    param.i_width = width;  //set frame width
    param.i_height = height;  //set frame height
    param.b_cabac = 0;
    param.i_bframe = 0;
    param.b_interlaced = 0;
    param.rc.i_rc_method = X264_RC_ABR;
    param.i_level_idc = 21;
    param.rc.i_bitrate = 128;
    param.b_intra_refresh = 1;
    param.b_annexb = 1;
    param.i_keyint_max = 25;
    param.i_fps_num = 15;
    param.i_fps_den = 1;

    param.i_slice_max_size = 1200;

    if (x264_param_apply_profile(&param, "baseline") < 0)
       return false;

    This reduces the NAL size, but it doesn’t seem to make any difference to the reliability issues.

    I’ve also tried fragmenting the NALs, using this Java code and RFC 3984 (RTP Payload Format for H.264 Video) as a reference, but it doesn’t work at all (code below), the server says "stream has stopped" immediately after it starts. I’ve tried including and excluding the NAL header (with the timestamp etc) in each fragment or just the first, but it doesn’t work for me either way.
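    For reference, the fragmentation scheme RFC 3984 describes for FU-A can be sketched as follows. This is only my reading of the RFC, not the code I am actually running; the function name and the 1200-byte payload cap are illustrative, and the input is assumed to be a single NAL unit with its Annex B start code already stripped:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// FU-A packetization per RFC 3984 (sketch). `nal` points at the NAL unit
// without its start code; nal[0] is the NAL octet (F, NRI, type).
std::vector<std::vector<uint8_t>> fragmentFUA(const uint8_t* nal, size_t len,
                                              size_t maxPayload = 1200) {
    std::vector<std::vector<uint8_t>> fragments;
    uint8_t nri  = nal[0] & 0x60;   // NRI bits, copied into every fragment
    uint8_t type = nal[0] & 0x1F;   // original NAL unit type
    size_t pos = 1;                 // the NAL octet itself is not sent verbatim
    while (pos < len) {
        size_t chunk = std::min(maxPayload, len - pos);
        std::vector<uint8_t> fu;
        fu.push_back(nri | 28);     // FU indicator: F=0, NRI, type 28 = FU-A
        uint8_t fuHeader = type;    // FU header carries the original type...
        if (pos == 1)           fuHeader |= 0x80;  // ...plus S bit on the first
        if (pos + chunk == len) fuHeader |= 0x40;  // ...and E bit on the last
        fu.push_back(fuHeader);
        fu.insert(fu.end(), nal + pos, nal + pos + chunk);
        fragments.push_back(std::move(fu));
        pos += chunk;
    }
    return fragments;
}
```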

    I’m pretty sure my issue has to be with the NAL size and not PPS/SPS or anything like that (as in this question) or with my network connection or test server, because everything works fine with the test pattern.

    I’m sending NAL_PPS and NAL_SPS (only once), and all NAL_SLICE_IDR and NAL_SLICE packets. I’m ignoring NAL_SEI and not sending it.

    One thing that is confusing me is that the source code that I can find on the internet that does similar things to what I want doesn’t match up with what the RFC specifies. For example, RFC 3984 section 5.3 defines the NAL octet, which should have the NAL type in the lower 5 bits and the NRI in bits 5 and 6 (bit 7 is zero). The types NAL_SLICE_IDR and NAL_SLICE have values of 5 and 1 respectively, which are the ones in table 7-1 of this document (PDF) referenced by the RFC and also the ones output by x264. But the code that actually works sets the NAL octet to 39 (0x27) and 23 (0x17), for reasons unknown to me. When implementing fragmented NALs, I’ve tried both following the spec and using the values copied over from the working code, but neither works.
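    Those 0x27/0x17 values happen to match the first byte of an FLV VideoData tag (FrameType in the high nibble: 1 = keyframe, 2 = inter frame; CodecID 7 = AVC in the low nibble) rather than an RTP NAL octet. Assuming that reading of the FLV layout is right (it is my interpretation, not something stated in the RFC, and flvAvcNaluHeader is a made-up name), the 9-byte prefix in front of each NAL unit would look like this:

```cpp
#include <cstdint>
#include <vector>

// Sketch of the 9-byte FLV VideoData prefix that RTMP carries in front of
// each AVC NAL unit, offered as a possible explanation for the 0x17/0x27
// bytes seen in working code.
std::vector<uint8_t> flvAvcNaluHeader(bool keyframe, uint32_t nalLen) {
    std::vector<uint8_t> h(9);
    h[0] = keyframe ? 0x17 : 0x27;  // FrameType << 4 | CodecID (7 = AVC)
    h[1] = 0x01;                    // AVCPacketType: 1 = NALU (0 = sequence header)
    h[2] = h[3] = h[4] = 0;         // 24-bit composition time offset
    h[5] = (nalLen >> 24) & 0xFF;   // 4-byte big-endian NALU length
    h[6] = (nalLen >> 16) & 0xFF;   // (an AVCC-style length prefix,
    h[7] = (nalLen >> 8)  & 0xFF;   //  not an Annex B start code)
    h[8] = nalLen & 0xFF;
    return h;
}
```

    Under that reading, the nine bytes my sendNAL writes before the payload appear to line up with this prefix byte for byte.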

    Any help appreciated.

    void sendNAL(unsigned char* buf, int len)
    {
       Logging::LogNumber("sendNAL", len);
       RTMPPacket * packet;
       long timeoffset = GetTickCount() - startTime;

       if (buf[2] == 0x00) { //00 00 00 01
           buf += 4;
           len -= 4;
       }
       else if (buf[2] == 0x01) { //00 00 01
           buf += 3;
           len -= 3;
       }
       else
       {
           Logging::LogStdString("INVALID x264 FRAME!");
       }
       int type = buf[0] & 0x1f;
       int maxNALSize = 1200;

       if (len <= maxNALSize)
       {
           packet = (RTMPPacket *)malloc(RTMP_HEAD_SIZE + len + 9);
           memset(packet, 0, RTMP_HEAD_SIZE);

           packet->m_body = (char *)packet + RTMP_HEAD_SIZE;
           packet->m_nBodySize = len + 9;

           unsigned char *body = (unsigned char *)packet->m_body;
           memset(body, 0, len + 9);

           body[0] = 0x27;
           if (type == NAL_SLICE_IDR) {
               body[0] = 0x17;
           }

           body[1] = 0x01;   //nal unit
           body[2] = 0x00;
           body[3] = 0x00;
           body[4] = 0x00;

           body[5] = (len >> 24) & 0xff;
           body[6] = (len >> 16) & 0xff;
           body[7] = (len >> 8) & 0xff;
           body[8] = (len) & 0xff;

           memcpy(&body[9], buf, len);

           packet->m_hasAbsTimestamp = 0;
           packet->m_packetType = RTMP_PACKET_TYPE_VIDEO;
           if (rtmp != NULL) {
               packet->m_nInfoField2 = rtmp->m_stream_id;
           }
           packet->m_nChannel = 0x04;
           packet->m_headerType = RTMP_PACKET_SIZE_LARGE;
           packet->m_nTimeStamp = timeoffset;

           if (rtmp != NULL) {
               RTMP_SendPacket(rtmp, packet, QUEUE_RTMP);
           }
           free(packet);
       }
       else
       {
           packet = (RTMPPacket *)malloc(RTMP_HEAD_SIZE + maxNALSize + 90);
           memset(packet, 0, RTMP_HEAD_SIZE);      

           // split large NAL into multiple smaller ones:
           int sentBytes = 0;
           bool firstFragment = true;
           while (sentBytes < len)
           {
               // decide how many bytes to send in this fragment:
               int fragmentSize = maxNALSize;
               if (sentBytes + fragmentSize > len)
                   fragmentSize = len - sentBytes;
               bool lastFragment = (sentBytes + fragmentSize) >= len;

               packet->m_body = (char *)packet + RTMP_HEAD_SIZE;
               int headerBytes = firstFragment ? 10 : 2;
               packet->m_nBodySize = fragmentSize + headerBytes;

               unsigned char *body = (unsigned char *)packet->m_body;
               memset(body, 0, fragmentSize + headerBytes);

               //key frame
               int NALtype = 0x27;
               if (type == NAL_SLICE_IDR) {
                   NALtype = 0x17;
               }

               // Set FU-A indicator
               body[0] = (byte)((NALtype & 0x60) & 0xFF); // FU indicator NRI
               body[0] += 28; // 28 = FU - A (fragmentation unit A) see RFC: https://tools.ietf.org/html/rfc3984

               // Set FU-A header
               body[1] = (byte)(NALtype & 0x1F);  // FU header type
               body[1] += (firstFragment ? 0x80 : 0) + (lastFragment ? 0x40 : 0); // Start/End bits

               body[2] = 0x01;   //nal unit
               body[3] = 0x00;
               body[4] = 0x00;
               body[5] = 0x00;

               body[6] = (len >> 24) & 0xff;
               body[7] = (len >> 16) & 0xff;
               body[8] = (len >> 8) & 0xff;
               body[9] = (len) & 0xff;

               //copy data
               memcpy(&body[headerBytes], buf + sentBytes, fragmentSize);

               packet->m_hasAbsTimestamp = 0;
               packet->m_packetType = RTMP_PACKET_TYPE_VIDEO;
               if (rtmp != NULL) {
                   packet->m_nInfoField2 = rtmp->m_stream_id;
               }
               packet->m_nChannel = 0x04;
               packet->m_headerType = RTMP_PACKET_SIZE_LARGE;
               packet->m_nTimeStamp = timeoffset;

               if (rtmp != NULL) {
                   RTMP_SendPacket(rtmp, packet, TRUE);
               }

               sentBytes += fragmentSize;
               firstFragment = false;
           }

           free(packet);
       }
    }
  • Streaming video from an image using FFMPEG on Windows

    26 May 2013, by Daniel Zohar

    I wrote a program that simulates a camera and converts its output into a video stream. The program is required to be able to run on Windows.
    There are two components in the system:

    1. Camera Simulator. A C++ program that simulates the camera. It copies a pre-generated frame (i.e. a PNG file) every 0.1 seconds, using the Windows copy command, to a destination path ./target/target_image.png
    2. Video Stream. Using FFmpeg, it creates a video stream out of the copied images. FFmpeg is run with the following command:
      ffmpeg -loop 1 -i ./target/target_image.png -r 10 -vcodec mpeg4 -f mpegts udp://127.0.0.1:1234

    When running the whole thing together, it works fine for a few seconds until ffmpeg halts. Here is a log from running in debug mode:

    ffmpeg version N-52458-gaa96439 Copyright (c) 2000-2013 the FFmpeg developers
     built on Apr 24 2013 22:19:32 with gcc 4.8.0 (GCC)
     configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libcaca --enable-libfreetype --enable-libgsm --enable-libilbc --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --enable-libxvid --enable-zlib
     libavutil      52. 27.101 / 52. 27.101
     libavcodec     55.  6.100 / 55.  6.100
     libavformat    55.  3.100 / 55.  3.100
     libavdevice    55.  0.100 / 55.  0.100
     libavfilter     3. 60.101 /  3. 60.101
     libswscale      2.  2.100 /  2.  2.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  3.100 / 52.  3.100
    Splitting the commandline.
    Reading option '-loop' ... matched as AVOption 'loop' with argument '1'.
    Reading option '-i' ... matched as input file with argument './target/target_image.png'.
    Reading option '-r' ... matched as option 'r' (set frame rate (Hz value, fraction or abbreviation)) with argument '10'.
    Reading option '-vcodec' ... matched as option 'vcodec' (force video codec ('copy' to copy stream)) with argument 'mpeg4'.
    Reading option '-f' ... matched as option 'f' (force format) with argument 'mpegts'.
    Reading option 'udp://127.0.0.1:1234' ... matched as output file.
    Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
    Finished splitting the commandline.
    Parsing a group of options: global .
    Applying option loglevel (set logging level) with argument debug.
    Successfully parsed a group of options.
    Parsing a group of options: input file ./target/target_image.png.
    Successfully parsed a group of options.
    Opening an input file: ./target/target_image.png.
    [AVIOContext @ 02678840] Statistics: 234307 bytes read, 0 seeks
    [AVIOContext @ 02678840] Statistics: 221345 bytes read, 0 seeks
       Last message repeated 1 times
    [AVIOContext @ 02678840] Statistics: 226329 bytes read, 0 seeks
       Last message repeated 2 times
    [AVIOContext @ 02678840] Statistics: 228676 bytes read, 0 seeks
       Last message repeated 2 times
    [AVIOContext @ 02678840] Statistics: 230685 bytes read, 0 seeks
       Last message repeated 2 times
    [AVIOContext @ 02678840] Statistics: 232697 bytes read, 0 seeks
       Last message repeated 5 times
    [AVIOContext @ 02678840] Statistics: 234900 bytes read, 0 seeks
       Last message repeated 2 times
    [AVIOContext @ 02678840] Statistics: 236847 bytes read, 0 seeks
    [image2 @ 02677ac0] Probe buffer size limit of 5000000 bytes reached
    Input #0, image2, from './target/target_image.png':
     Duration: 00:00:00.04, start: 0.000000, bitrate: N/A
       Stream #0:0, 22, 1/25: Video: png, rgb24, 1274x772 [SAR 1:1 DAR 637:386], 1/25, 25 fps, 25 tbr, 25 tbn, 25 tbc
    Successfully opened the file.
    Parsing a group of options: output file udp://127.0.0.1:1234.
    Applying option r (set frame rate (Hz value, fraction or abbreviation)) with argument 10.
    Applying option vcodec (force video codec ('copy' to copy stream)) with argument mpeg4.
    Applying option f (force format) with argument mpegts.
    Successfully parsed a group of options.
    Opening an output file: udp://127.0.0.1:1234.
    Successfully opened the file.
    [graph 0 input from stream 0:0 @ 02769280] Setting 'video_size' to value '1274x772'
    [graph 0 input from stream 0:0 @ 02769280] Setting 'pix_fmt' to value '2'
    [graph 0 input from stream 0:0 @ 02769280] Setting 'time_base' to value '1/25'
    [graph 0 input from stream 0:0 @ 02769280] Setting 'pixel_aspect' to value '1/1'
    [graph 0 input from stream 0:0 @ 02769280] Setting 'sws_param' to value 'flags=2'
    [graph 0 input from stream 0:0 @ 02769280] Setting 'frame_rate' to value '25/1'
    [graph 0 input from stream 0:0 @ 02769280] w:1274 h:772 pixfmt:rgb24 tb:1/25 fr:25/1 sar:1/1 sws_param:flags=2
    [format @ 02768ba0] compat: called with args=[yuv420p]
    [format @ 02768ba0] Setting 'pix_fmts' to value 'yuv420p'
    [auto-inserted scaler 0 @ 02768740] Setting 'w' to value '0'
    [auto-inserted scaler 0 @ 02768740] Setting 'h' to value '0'
    [auto-inserted scaler 0 @ 02768740] Setting 'flags' to value '0x4'
    [auto-inserted scaler 0 @ 02768740] w:0 h:0 flags:'0x4' interl:0
    [format @ 02768ba0] auto-inserting filter 'auto-inserted scaler 0' between the filter 'Parsed_null_0' and the filter 'format'
    [AVFilterGraph @ 026772c0] query_formats: 4 queried, 3 merged, 1 already done, 0 delayed
    [auto-inserted scaler 0 @ 02768740] w:1274 h:772 fmt:rgb24 sar:1/1 -> w:1274 h:772 fmt:yuv420p sar:1/1 flags:0x4
    [mpeg4 @ 02785020] detected 4 logical cores
    [mpeg4 @ 02785020] intra_quant_bias = 0 inter_quant_bias = -64
    [mpegts @ 0277da40] muxrate VBR, pcr every 1 pkts, sdt every 200, pat/pmt every 40 pkts
    Output #0, mpegts, to 'udp://127.0.0.1:1234':
     Metadata:
       encoder         : Lavf55.3.100
       Stream #0:0, 0, 1/90000: Video: mpeg4, yuv420p, 1274x772 [SAR 1:1 DAR 637:386], 1/10, q=2-31, 200 kb/s, 90k tbn, 10 tbc
    Stream mapping:
     Stream #0:0 -> #0:0 (png -> mpeg4)
    Press [q] to stop, [?] for help
    *** drop!
       Last message repeated 10 times
    frame=   11 fps=0.0 q=4.0 size=     118kB time=00:00:01.10 bitrate= 875.1kbits/s dup=0 drop=11    
    Statistics: 242771 bytes read, 0 seeks
    [AVIOContext @ 02674a60] Statistics: 246525 bytes read, 0 seeks
    *** drop!
    [AVIOContext @ 02674a60] Statistics: 230678 bytes read, 0 seeks
    [AVIOContext @ 02674a60] Statistics: 244023 bytes read, 0 seeks
    *** drop!
    [AVIOContext @ 02674a60] Statistics: 246389 bytes read, 0 seeks

    *** drop!
    [AVIOContext @ 02674a60] Statistics: 224478 bytes read, 0 seeks
    [AVIOContext @ 02674a60] Statistics: 228013 bytes read, 0 seeks
    *** drop!
    [image2 @ 02677ac0] Could not open file : ./target/target_image.png
    ./target/target_image.png: Input/output error
    [output stream 0:0 @ 02768c20] EOF on sink link output stream 0:0:default.
    No more output streams to write to, finishing.
    frame=  164 fps= 17 q=31.0 Lsize=     959kB time=00:00:16.40 bitrate= 478.9kbits/s dup=0 drop=240    

    video:869kB audio:0kB subtitle:0 global headers:0kB muxing overhead 10.285235%
    404 frames successfully decoded, 0 decoding errors
    [AVIOContext @ 026779c0] Statistics: 0 seeks, 746 writeouts

    It seems to me there's some kind of collision between reading and writing to/from the same file. What's also interesting is that on Linux (replacing copy with cp) the program works just fine.

    Can someone suggest a way to solve this issue? Alternative solutions are also acceptable as long as the logical workflow remains the same.
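
    One common way to avoid this kind of collision, sketched below under the assumption that writer and reader sit on the same volume (writeFrameAtomically is a made-up name, not code from the question), is to write each frame to a temporary file and then rename it over the target, so the reader never observes a half-written PNG:

```cpp
#include <filesystem>
#include <fstream>
#include <vector>

namespace fs = std::filesystem;

// Write the frame to a temp file in the same directory, then rename it over
// the target. Within one volume the rename replaces the destination in a
// single step (std::filesystem::rename overwrites an existing file; on
// Windows the same effect is available via MoveFileEx/ReplaceFile).
void writeFrameAtomically(const fs::path& target,
                          const std::vector<char>& pngBytes) {
    fs::path tmp = target;
    tmp += ".tmp";                  // same directory, hence same volume
    {
        std::ofstream out(tmp, std::ios::binary | std::ios::trunc);
        out.write(pngBytes.data(),
                  static_cast<std::streamsize>(pngBytes.size()));
    }                               // close the stream before renaming
    fs::rename(tmp, target);
}
```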