Advanced search

Media (17)

Keyword: - Tags -/wired

Other articles (40)

  • What is an editorial?

    21 June 2013

    Write your point of view in an article. It will be filed in a section set aside for this purpose.
    An editorial is a text-only article. Its purpose is to gather points of view in a dedicated section. A single editorial is featured on the home page; to read the previous ones, browse the dedicated section.
    You can customize the editorial creation form.
    Editorial creation form: in the case of a document of the editorial type, the (...)

  • List of compatible distributions

    26 April 2011

    The table below lists the Linux distributions compatible with MediaSPIP's automated installation script.

    Distribution name | Version name         | Version number
    Debian            | Squeeze              | 6.x.x
    Debian            | Wheezy               | 7.x.x
    Debian            | Jessie               | 8.x.x
    Ubuntu            | The Precise Pangolin | 12.04 LTS
    Ubuntu            | The Trusty Tahr      | 14.04

    If you want to help us improve this list, you can give us access to a machine whose distribution is not mentioned above, or send us the necessary fixes to add (...)

  • Images

    15 May 2013

On other sites (8086)

  • Custom IO writes only the header, the rest of the frames seem omitted

    18 September 2023, by Daniel

    I'm using libavformat to read packets from RTSP and remux them to mp4 (fragmented).

    The video frames must stay intact, meaning I don't want to transcode, modify, or change anything.
    Video frames shall be remuxed into mp4 in their original form (i.e. the NALUs shall remain the same).

    I have updated libavformat to the latest version (currently 4.4).

    Here is my snippet:

//open input; probesize is set to 32, we don't need to decode anything
avformat_open_input(...);

//open output with custom IO
avformat_alloc_output_context2(&ofctx, ...);
ofctx->pb = avio_alloc_context(buffer, bufsize, 1/*write flag*/, 0, 0, &writeOutput, 0);
ofctx->flags |= AVFMT_FLAG_NOBUFFER | AVFMT_FLAG_FLUSH_PACKETS | AVFMT_FLAG_CUSTOM_IO;

avformat_write_header(...);

//loop
av_read_frame()
LOGPACKET_DETAILS //<- this works, packets are coming
av_write_frame() //<- this doesn't work, my write callback is not called. av_interleaved_write_frame doesn't seem to work either.

int writeOutput(void *opaque, uint8_t *buffer, int buffer_size) {
  printf("writeOutput: writing %d bytes: ", buffer_size);
  return buffer_size; //an AVIO write callback must return the number of bytes handled
}

    avformat_write_header works; it prints the header correctly.

    I'm looking for the reason why my custom IO callback is not called after a frame has been read.

    There must be some more flags to set, telling avformat not to care about decoding and to just write out whatever comes in.

    More information:
    The input stream is VBR-encoded H264. It seems av_write_frame calls my write function only in the case of an SPS, PPS, or IDR frame. Non-IDR frames are not passed through at all.
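
    This behavior is consistent with how the mp4 muxer produces fragmented output: when fragments are cut at keyframes, the muxer buffers packets internally and only hands a complete fragment (a moof/mdat pair) to the IO layer once the next keyframe arrives, so the write callback fires once per fragment rather than once per packet. A minimal sketch of how fragmented output is typically requested from the mov/mp4 muxer; the movflags values are real muxer options, but whether they match the asker's setup is an assumption, and "opts" is an illustrative name:

//sketch: cut a fragment at every keyframe and write a streamable header up front
AVDictionary *opts = NULL;
av_dict_set(&opts, "movflags", "frag_keyframe+empty_moov+default_base_moof", 0);
avformat_write_header(ofctx, &opts);
av_dict_free(&opts);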

    Update

    I found out that if I request an IDR frame every second (I can ask the encoder for this), writeOutput is called every second.

    I created a test: after a client joins, I request the encoder to create IDRs at 1 Hz, ten times. Libav calls writeOutput at 1 Hz for those 10 seconds, but then the encoder sets itself back to creating an IDR only every 10 seconds, and libav then calls writeOutput only every 10 s, which makes my decoder fail. With 1 Hz IDRs, the decoder is fine.
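
    That observation fits the fragmentation explanation above: one fragment is flushed per IDR, so writeOutput fires exactly at the IDR cadence. If the encoder cannot sustain frequent IDRs, the mov muxer can be asked to cut fragments by elapsed duration instead of by keyframe; a hedged sketch (frag_duration is a real mov muxer option in microseconds, but the 500 ms value is an arbitrary example, and a fragment may then begin on a non-IDR frame, which the receiving decoder has to tolerate):

//sketch: flush a fragment every 500 ms of content instead of waiting for an IDR
av_dict_set(&opts, "movflags", "empty_moov+default_base_moof", 0);
av_dict_set(&opts, "frag_duration", "500000", 0); //in microseconds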

  • Stream OpenGL framebuffer over HTTP (via FFmpeg)

    17 June 2016, by mOfl

    I have an OpenGL application whose rendered images need to be streamed over the internet to mobile clients. Previously, it sufficed to simply record the rendering into a video file, which is already working, and this should now be extended to streaming.

    What is working right now:

    • Render a scene to an OpenGL framebuffer object
    • Capture the FBO content using NvIFR
    • Encode it to H.264 using NvENC (no CPU round trip required)
    • Download the encoded frame to host memory as a byte array
    • Append this frame to a video file

    None of these steps involves FFmpeg or any other library so far. I now want to replace the last step with "Stream the current frame’s byte array over the internet", and I assume that using FFmpeg and FFserver would be a reasonable choice for this. Am I correct? If not, what would be the proper way?

    If so, how do I approach this within my C++ code? As pointed out, the frame is already encoded. Also, there is no sound or other stuff, simply an H.264-encoded frame as a byte array that is updated irregularly and should be converted into a steady video stream. I assume that this would be FFmpeg’s job and that the subsequent streaming via FFserver would be simple from there. What I don’t know is how to feed my data to FFmpeg in the first place, as all the FFmpeg tutorials I found (in a non-exhaustive search) work on a file or a webcam/capture device as the data source, not on volatile data in main memory.
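
    One plausible way to feed in-memory data to FFmpeg, without touching its C API, is to spawn the ffmpeg binary and write the encoded byte stream to its stdin. The sketch below assumes the NvENC output is Annex-B H.264, that POSIX popen is available, and that the output URL and frame rate are placeholders rather than anything from the question; "-c copy" keeps the already-encoded bitstream untouched:

#include <cstdio>
#include <cstdint>
#include <vector>

int main() {
    //assumptions: ffmpeg is on PATH, POSIX popen, placeholder output URL
    FILE *ff = popen(
        "ffmpeg -f h264 -framerate 60 -i - "
        "-c copy -f flv rtmp://example.com/live/stream", "w");
    if (!ff) return 1;

    //in the real render loop this would be the byte array downloaded from NvENC
    std::vector<uint8_t> frame = {0, 0, 0, 1 /* ...Annex-B NALU bytes... */};
    fwrite(frame.data(), 1, frame.size(), ff);
    fflush(ff); //hand each frame to ffmpeg immediately, not on buffer fill

    pclose(ff);
    return 0;
}

    Since a raw Annex-B stream carries no timestamps, -framerate only declares a nominal rate; irregular frame timing still needs separate handling.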

    The file mentioned above, which I am already able to create, is a C++ file stream to which I append each single frame, meaning that differing video and rendering framerates are not treated correctly. This also needs to be taken care of at some point.
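
    Should the C API route be taken later, one way to decouple the render rate from the video timeline is to stamp each frame with a PTS taken from a steady clock rather than from a frame counter. A minimal illustration of the arithmetic; the millisecond timebase is an assumption, not something from the question:

#include <chrono>
#include <cstdint>

//PTS in milliseconds since streaming started; with a 1/1000 stream timebase,
//frames then play back at the wallclock moments at which they were rendered
int64_t wallclock_pts_ms(std::chrono::steady_clock::time_point start) {
    using namespace std::chrono;
    return duration_cast<milliseconds>(steady_clock::now() - start).count();
}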

    Can somebody point me in the right direction? Can I forward data from my application to FFmpeg to build a proper video feed without writing to the hard disk? Tutorials are greatly appreciated. By the way, FFmpeg/FFserver is not mandatory; if you have a better idea for streaming OpenGL framebuffer contents, I’m eager to know.