Other articles (98)

  • Organizing by category

    17 May 2013, by

    In MédiaSPIP, a section has two names: category and rubrique.
    The documents stored in MédiaSPIP can be filed under different categories. You can create a category by clicking "publish a category" in the publish menu at the top right (after logging in). A category can itself be placed inside another category, which means you can build a whole tree of categories.
    The next time a document is published, the newly created category will be offered (...)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform (XMP), an XML-based metadata format used in PDF, photography, and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use in the Semantic Web.
    XMP makes it possible to store, as an XML document, information about a file: title, author, history (...)

  • Customizing categories

    21 June 2013, by

    Category creation form
    For those who know SPIP well, a category can be thought of as a rubrique (section).
    For a document of type category, the fields offered by default are: Text
    This form can be modified under:
    Administration > Configuration des masques de formulaire.
    For a document of type media, the fields not displayed by default are: Short description
    It is also in this configuration section that you can specify the (...)

On other sites (10157)

  • How to send large x264 NAL over RTMP?

    17 September 2017, by samgak

    I’m trying to stream video over RTMP using x264 and rtmplib in C++ on Windows.

    So far I have managed to encode and stream a test video pattern consisting of animated multi-colored vertical lines that I generate in code. It’s possible to start and stop the stream, and start and stop the player, and it works every time. However, as soon as I modify it to send encoded camera frames instead of the test pattern, the streaming becomes very unreliable. It only starts <20% of the time, and stopping and restarting doesn’t work.

    After searching around for answers, I concluded that it must be because the NAL size is too large (my test pattern is mostly flat color, so it encodes to a very small size) and that an Ethernet packet limit of around 1400 bytes is affecting it. So I tried to make x264 output only NALs under 1200 bytes, by setting i_slice_max_size in my x264 setup:

     if (x264_param_default_preset(&param, "veryfast", "zerolatency") < 0)
       return false;
    param.i_csp = X264_CSP_I420;
    param.i_threads = 1;
    param.i_width = width;  //set frame width
    param.i_height = height;  //set frame height
    param.b_cabac = 0;
    param.i_bframe = 0;
    param.b_interlaced = 0;
    param.rc.i_rc_method = X264_RC_ABR;
    param.i_level_idc = 21;
    param.rc.i_bitrate = 128;
    param.b_intra_refresh = 1;
    param.b_annexb = 1;
    param.i_keyint_max = 25;
    param.i_fps_num = 15;
    param.i_fps_den = 1;

    param.i_slice_max_size = 1200;

     if (x264_param_apply_profile(&param, "baseline") < 0)
       return false;

    This reduces the NAL size, but it doesn’t seem to make any difference to the reliability issues.

    I’ve also tried fragmenting the NALs, using this Java code and RFC 3984 (RTP Payload Format for H.264 Video) as a reference, but it doesn’t work at all (code below): the server says "stream has stopped" immediately after it starts. I’ve tried including and excluding the NAL header (with the timestamp etc.) in each fragment or just the first, but it doesn’t work for me either way.

    I’m pretty sure my issue has to be with the NAL size and not PPS/SPS or anything like that (as in this question) or with my network connection or test server, because everything works fine with the test pattern.

    I’m sending NAL_PPS and NAL_SPS (only once), and all NAL_SLICE_IDR and NAL_SLICE packets. I’m ignoring NAL_SEI and not sending it.

    One thing that is confusing me is that the source code that I can find on the internet that does similar things to what I want doesn’t match up with what the RFC specifies. For example, RFC 3984 section 5.3 defines the NAL octet, which should have the NAL type in the lower 5 bits and the NRI in bits 5 and 6 (bit 7 is zero). The types NAL_SLICE_IDR and NAL_SLICE have values of 5 and 1 respectively, which are the ones in table 7-1 of this document (PDF) referenced by the RFC and also the ones output by x264. But the code that actually works sets the NAL octet to 39 (0x27) and 23 (0x17), for reasons unknown to me. When implementing fragmented NALs, I’ve tried both following the spec and using the values copied over from the working code, but neither works.
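    For reference, the FU-A bit layout that RFC 3984 describes can be sketched in a few lines; this is a minimal illustration of the spec's byte packing only (the function and variable names are mine, not from rtmplib or x264):

    ```cpp
    #include <cassert>
    #include <cstdint>
    #include <utility>

    // Build the FU indicator and FU header bytes for one fragment of a NAL unit,
    // following RFC 3984: the indicator keeps the NRI bits (5-6) of the original
    // NAL octet and uses type 28 (FU-A); the header keeps the original type in
    // the low 5 bits and sets the Start/End bits on the first/last fragment.
    std::pair<uint8_t, uint8_t> fuaBytes(uint8_t nalOctet, bool first, bool last) {
        uint8_t indicator = (nalOctet & 0x60) | 28;  // F=0, NRI, type=28 (FU-A)
        uint8_t header = (nalOctet & 0x1F)           // original NAL type
                       | (first ? 0x80 : 0)          // S (start) bit
                       | (last ? 0x40 : 0);          // E (end) bit
        return {indicator, header};
    }

    int main() {
        // An IDR slice (type 5) with NRI = 3 has NAL octet 0x65.
        auto [ind, hdr] = fuaBytes(0x65, /*first=*/true, /*last=*/false);
        assert(ind == 0x7C); // 0x60 (NRI bits) | 28
        assert(hdr == 0x85); // S bit | type 5
    }
    ```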

    Any help appreciated.

    void sendNAL(unsigned char* buf, int len)
    {
       Logging::LogNumber("sendNAL", len);
       RTMPPacket * packet;
       long timeoffset = GetTickCount() - startTime;

       if (buf[2] == 0x00) { //00 00 00 01
           buf += 4;
           len -= 4;
       }
       else if (buf[2] == 0x01) { //00 00 01
           buf += 3;
           len -= 3;
       }
       else
       {
           Logging::LogStdString("INVALID x264 FRAME!");
       }
       int type = buf[0] & 0x1f;
       int maxNALSize = 1200;

       if (len <= maxNALSize)
       {
           packet = (RTMPPacket *)malloc(RTMP_HEAD_SIZE + len + 9);
           memset(packet, 0, RTMP_HEAD_SIZE);

           packet->m_body = (char *)packet + RTMP_HEAD_SIZE;
           packet->m_nBodySize = len + 9;

           unsigned char *body = (unsigned char *)packet->m_body;
           memset(body, 0, len + 9);

           body[0] = 0x27;
           if (type == NAL_SLICE_IDR) {
               body[0] = 0x17;
           }

           body[1] = 0x01;   //nal unit
           body[2] = 0x00;
           body[3] = 0x00;
           body[4] = 0x00;

           body[5] = (len >> 24) & 0xff;
           body[6] = (len >> 16) & 0xff;
           body[7] = (len >> 8) & 0xff;
           body[8] = (len) & 0xff;

           memcpy(&body[9], buf, len);

           packet->m_hasAbsTimestamp = 0;
           packet->m_packetType = RTMP_PACKET_TYPE_VIDEO;
           if (rtmp != NULL) {
               packet->m_nInfoField2 = rtmp->m_stream_id;
           }
           packet->m_nChannel = 0x04;
           packet->m_headerType = RTMP_PACKET_SIZE_LARGE;
           packet->m_nTimeStamp = timeoffset;

           if (rtmp != NULL) {
               RTMP_SendPacket(rtmp, packet, QUEUE_RTMP);
           }
           free(packet);
       }
       else
       {
           packet = (RTMPPacket *)malloc(RTMP_HEAD_SIZE + maxNALSize + 90);
           memset(packet, 0, RTMP_HEAD_SIZE);      

           // split large NAL into multiple smaller ones:
           int sentBytes = 0;
           bool firstFragment = true;
           while (sentBytes < len)
           {
               // decide how many bytes to send in this fragment:
               int fragmentSize = maxNALSize;
               if (sentBytes + fragmentSize > len)
                   fragmentSize = len - sentBytes;
               bool lastFragment = (sentBytes + fragmentSize) >= len;

               packet->m_body = (char *)packet + RTMP_HEAD_SIZE;
               int headerBytes = firstFragment ? 10 : 2;
               packet->m_nBodySize = fragmentSize + headerBytes;

               unsigned char *body = (unsigned char *)packet->m_body;
               memset(body, 0, fragmentSize + headerBytes);

               //key frame
               int NALtype = 0x27;
               if (type == NAL_SLICE_IDR) {
                   NALtype = 0x17;
               }

               // Set FU-A indicator
               body[0] = (byte)((NALtype & 0x60) & 0xFF); // FU indicator NRI
               body[0] += 28; // 28 = FU-A (fragmentation unit A) see RFC: https://tools.ietf.org/html/rfc3984

               // Set FU-A header
               body[1] = (byte)(NALtype & 0x1F);  // FU header type
               body[1] += (firstFragment ? 0x80 : 0) + (lastFragment ? 0x40 : 0); // Start/End bits

               body[2] = 0x01;   //nal unit
               body[3] = 0x00;
               body[4] = 0x00;
               body[5] = 0x00;

               body[6] = (len >> 24) & 0xff;
               body[7] = (len >> 16) & 0xff;
               body[8] = (len >> 8) & 0xff;
               body[9] = (len) & 0xff;

               //copy data
               memcpy(&body[headerBytes], buf + sentBytes, fragmentSize);

               packet->m_hasAbsTimestamp = 0;
               packet->m_packetType = RTMP_PACKET_TYPE_VIDEO;
               if (rtmp != NULL) {
                   packet->m_nInfoField2 = rtmp->m_stream_id;
               }
               packet->m_nChannel = 0x04;
               packet->m_headerType = RTMP_PACKET_SIZE_LARGE;
               packet->m_nTimeStamp = timeoffset;

               if (rtmp != NULL) {
                   RTMP_SendPacket(rtmp, packet, TRUE);
               }

               sentBytes += fragmentSize;
               firstFragment = false;
           }

           free(packet);
       }
    }

  • Why is image loading from ffmpeg using stdout slow?

    23 October 2017, by Jirka Šáda

    I was just experimenting with FFmpeg and wrote this test code, which reads frames from FFmpeg and displays them:

    public partial class MainWindow : Window
    {
       private WriteableBitmap wbm = new WriteableBitmap(1920, 1080, 96, 96, PixelFormats.Rgb24, null);
       private IntPtr backBuffer;
       private int totalFrames = 0;
       private bool drawing = false;

       public MainWindow()
       {
           InitializeComponent();

           backBuffer = wbm.BackBuffer;

           Thread t = new Thread(new ThreadStart(RunFFMPEG));
           t.Start();

           Thread measurer = new Thread(new ThreadStart(Measure));
           measurer.Start();
       }

       private void Measure()
       {
           while(true)
           {
               Thread.Sleep(1000);
               Console.Title = "FPS: " + totalFrames;
               totalFrames = 0;
           }
       }

       private void UpdateBitmap()
       {
           drawing = true;

           wbm.Lock();
           wbm.AddDirtyRect(new Int32Rect(0, 0, 1920, 1080));
           wbm.Unlock();

           mainImage.Source = wbm;

           drawing = false;
       }

       private void RunFFMPEG()
       {
           totalFrames = 0;

           int length = 1920 * 1080 * 3;

           // start the FFmpeg process and redirect its standard output
           ProcessStartInfo info = new ProcessStartInfo("ffmpeg.exe", "-f dshow -vcodec mjpeg -video_size 1920x1080 -framerate 30 -i video=\"A4 TECH HD PC Camera\" -y -pix_fmt rgb24 -f rawvideo -framerate 30 pipe:1");
           info.WindowStyle = ProcessWindowStyle.Hidden;
           info.UseShellExecute = false;
           info.CreateNoWindow = true;
           info.RedirectStandardOutput = true;

           Process p = new Process();
           p.StartInfo = info;
           p.Start();

           while (true) // read frames from standard output
           {

               byte[] data = new byte[length];

               for (int i = 0; i < length; i++) // read byte by byte: a single normal Read() call returns before the buffer is full
                   data[i] = (byte)p.StandardOutput.BaseStream.ReadByte();

               if (!drawing)
               {
                   Marshal.Copy(data, 0, backBuffer, length);
                   Dispatcher.Invoke(() => UpdateBitmap()); // we are not in UI thread
               }

               totalFrames++;
           }
       }  
    }

    The problem is that although around 17-20 FPS is reported, the Image redraws very slowly (once or twice per second), and even though the mainImage.Source = ... line always takes a similar amount of time, the speed at which the Image visually updates gets progressively slower. What could be the cause? Thanks for any tips.

    PS: I know the code is ugly and doesn’t follow the MVVM pattern, but it’s only for testing, please don’t hate me :-)

    Edit: The problem is not the speed at which I’m reading from stdout (that’s fine, 20 FPS); the problem is that the WPF Image is updating visually very slowly.
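    The byte-by-byte read loop above is a known bottleneck pattern: the usual approach is to loop a bulk read until the frame buffer is full. A minimal sketch of that read-exact pattern (shown in C++ over a generic stream; the function name is mine, not from the question's code):

    ```cpp
    #include <cassert>
    #include <istream>
    #include <sstream>
    #include <string>
    #include <vector>

    // Read exactly `count` bytes from a stream by looping a bulk read until the
    // buffer is full, instead of reading one byte per call. A single read on a
    // pipe may return fewer bytes than requested, so the loop is needed, but
    // each call still transfers a large chunk.
    bool readExact(std::istream& in, char* buf, size_t count) {
        size_t got = 0;
        while (got < count) {
            in.read(buf + got, static_cast<std::streamsize>(count - got));
            std::streamsize n = in.gcount();
            if (n <= 0)
                return false; // EOF or error before the frame was complete
            got += static_cast<size_t>(n);
        }
        return true;
    }

    int main() {
        std::istringstream pipe(std::string(16, 'x')); // stand-in for the process pipe
        std::vector<char> frame(16);
        assert(readExact(pipe, frame.data(), frame.size()));
        assert(frame[0] == 'x' && frame[15] == 'x');
    }
    ```

    The same shape applies to the C# code in the question: call Stream.Read with the remaining length in a loop rather than ReadByte per pixel byte.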

  • ffmpeg with many inputs and filters

    11 November 2017, by Nabi K.A.Z.

    I use this command:

    ffmpeg -i main.mp3 -i 1.mp3 -i 2.mp3 -i 3.mp3 \
          -filter_complex "\
             [0]volume=1[a0]; \
             [1]adelay=1000[a1]; \
             [2]adelay=2000[a2]; \
             [3]adelay=3000[a3]; \
             [a0][a1][a2][a3]amix=4:duration=first" \
    -y merged.mp3

    But I have many inputs, more than 500, so my pattern ends up like this:

    ffmpeg -i main.mp3 -i 1.mp3 -i 2.mp3 -i 3.mp3 ... [n.mp3] \
          -filter_complex "\
             [0]volume=1[a0]; \
             [1]adelay=1000[a1]; \
             [2]adelay=2000[a2]; \
             [3]adelay=3000[a3]; \
             :
             :
             [n]adelay=1000000[aN]; \
             [a0][a1][a2][a3] ... [aN] amix=[n]:duration=first" \
    -y merged.mp3
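    The repetitive part of such a graph can be generated programmatically rather than typed; a minimal C++ sketch following the pattern above (1000 ms of extra delay per input; the function name is mine). Writing the result to a file and passing it with ffmpeg's -filter_complex_script option also sidesteps command-line length limits:

    ```cpp
    #include <cassert>
    #include <sstream>
    #include <string>

    // Build the -filter_complex graph for n delayed inputs mixed with input 0,
    // following the pattern in the question: input k is delayed by k*1000 ms
    // and all labeled streams are fed into one amix.
    std::string buildFilterGraph(int n) {
        std::ostringstream graph, labels;
        graph << "[0]volume=1[a0];";
        labels << "[a0]";
        for (int k = 1; k <= n; ++k) {
            graph << "[" << k << "]adelay=" << k * 1000 << "[a" << k << "];";
            labels << "[a" << k << "]";
        }
        graph << labels.str() << "amix=inputs=" << (n + 1) << ":duration=first";
        return graph.str();
    }

    int main() {
        assert(buildFilterGraph(2) ==
               "[0]volume=1[a0];[1]adelay=1000[a1];[2]adelay=2000[a2];"
               "[a0][a1][a2]amix=inputs=3:duration=first");
    }
    ```

    Note that generating the graph does not lift the amix input cap that the Linux log below reports (inputs limited to [1 - 32] in that build), so mixing in batches would still be needed there.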

    But I get errors on every platform.

    Linux:

    :
    :
    Input #200, mp3, from 'sounds2/200.mp3':
     Metadata:
       encoder         : Lavf56.25.101
     Duration: 00:00:02.95, start: 0.069063, bitrate: 24 kb/s
       Stream #200:0: Audio: mp3, 16000 Hz, mono, s16p, 24 kb/s
    [amix @ 0x3cffb20] Value 201.000000 for parameter 'inputs' out of range [1 - 32]
       Last message repeated 1 times
    [amix @ 0x3cffb20] Error setting option inputs to value 201.
    [Parsed_amix_201 @ 0x3cffa60] Error applying options to the filter.
    [AVFilterGraph @ 0x3b44980] Error initializing filter 'amix' with args '201:duration=first'
    Error configuring filters.

    Cygwin:

     Stream #199:0 (mp3) -> adelay
     Stream #200:0 (mp3) -> adelay
     amix -> Stream #0:0 (aac)
    Press [q] to stop, [?] for help
    [Parsed_adelay_171 @ 00000000041fc0c0] Failed to configure input pad on Parsed_adelay_171
    Error reinitializing filters!
    Failed to inject frame into filter network: Cannot allocate memory
    Error while processing the decoded data for stream #200:0
    Conversion failed!

    Windows:

    The command line is too long.

    What’s your solution?