
Other articles (79)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, as announced here.
    The zip file provided here contains only the MediaSPIP sources in standalone form.
    As with the previous version, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things):
    implementation costs to be shared between several different projects / individuals
    rapid deployment of multiple unique sites
    creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (12928)

  • Decoding mp4/mkv using FFMPEG fails

    14 October 2016, by StarShine

    I’m using a project based on the latest FFmpeg git source tree, linking against the shared DLLs published by Zeranoe at https://ffmpeg.zeranoe.com/builds/

    The playback code works and loops. It plays back h265 files (raw), mpeg, avi, and mpg files. However, as soon as an mp4 or mkv container is specified as the input file, regardless of what’s inside, the codec dumps a lot of errors. It doesn’t matter whether it’s HEVC or h264.

    [h264 @ 00000000xyz] No start code is found
    [h264 @ 00000000xyz] Error splitting the input into NAL units.

    To make everything really strange, ffplay.exe plays these files just fine.

    I realize that I could probably work around this by converting the files to a raw format first, but I would like to be able to read and parse mp4 files as they are. Since I am using the pre-built libs from Zeranoe, my guess would be that something was not enabled during the build, but then I would expect ffplay to fail too. Do I need to set a flag in the format_context or codec_context, or provide some sort of filter identifier?
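    For context on that error message: MP4 and MKV containers normally carry H.264 in length-prefixed AVCC form rather than with Annex B start codes, and the SPS/PPS headers live in the container's extradata (AVCodecParameters::extradata, which avcodec_parameters_to_context() copies into the codec context before avcodec_open2() — the setup code below copies only width and height). A decoder opened without that extradata complains exactly like this. As an illustration only, here is a minimal, self-contained sketch of the repackaging that FFmpeg's h264_mp4toannexb bitstream filter performs; the function name and buffer layout are illustrative, not FFmpeg API:

```c
#include <stdint.h>
#include <string.h>

/* Sketch (not FFmpeg API): repackage an AVCC buffer -- 4-byte
 * big-endian NAL length prefixes, as stored inside MP4/MKV -- into
 * Annex B form -- 00 00 00 01 start codes, as raw .h264 files use.
 * Returns the number of bytes written to out, or -1 on malformed
 * input or an undersized output buffer. */
int avcc_to_annexb(const uint8_t *in, int in_len, uint8_t *out, int out_cap)
{
    static const uint8_t startcode[4] = {0, 0, 0, 1};
    int ip = 0, op = 0;
    while (ip + 4 <= in_len) {
        uint32_t nal_len = ((uint32_t)in[ip] << 24) | (in[ip + 1] << 16)
                         | (in[ip + 2] << 8) | in[ip + 3];
        ip += 4;
        if (nal_len > (uint32_t)(in_len - ip)) return -1; /* truncated NAL */
        if (op + 4 + (int)nal_len > out_cap) return -1;   /* out too small */
        memcpy(out + op, startcode, 4);                   /* emit start code */
        op += 4;
        memcpy(out + op, in + ip, nal_len);               /* copy NAL payload */
        op += nal_len;
        ip += nal_len;
    }
    return (ip == in_len) ? op : -1; /* trailing bytes are an error */
}
```

    In practice one would not hand-roll this: copying the full codec parameters with avcodec_parameters_to_context() lets the decoder consume AVCC packets directly, which is presumably why ffplay (which does exactly that) handles the same files fine.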

    Movies that play fine came from http://bbb3d.renderfarming.net/download.html, http://www.w6rz.net/ and http://www.sample-videos.com/

    These work:

    big_buck_bunny_480p_surround-fix.avi
    bigbuckbunny_480x272.h265

    Being a total noob at ffmpeg, please help me understand what is wrong and how to fix it. If the pre-built libs are the culprit, then the second question is whether someone has a convenient cmake setup to build this for Windows x64 and x86, debug and release targets.

    Here’s the source for initializing ffmpeg for reading:

    avdevice_register_all();
    avfilter_register_all();
    av_register_all();
    avformat_network_init();

    The format is parsed as follows:

    m_FormatContext = avformat_alloc_context();
    if (avformat_open_input(&m_FormatContext, file.GetPath().ToString().c_str(), NULL, NULL) != 0)
    {
       //std::cout << "failed to open input" << std::endl;
       success = false;
    }
    // find stream info
    if (success)
    {
       if (avformat_find_stream_info(m_FormatContext, NULL) < 0)
       {
           //std::cout << "failed to get stream info" << std::endl;
           success = false;
       }
    }

    The stream is opened as follows:

    m_VideoStream = avstream;
    m_FormatContext = formatContext;
    if (m_VideoStream)
    {
       m_StreamIndex = m_VideoStream->stream_identifier;
       AVCodecParameters *codecpar = m_VideoStream->codecpar;      
       if (codecpar)
       {
           AVCodecID codec_id = codecpar->codec_id;
           m_Decoder = avcodec_find_decoder(codec_id);
           if (m_Decoder)
           {
               m_CodecContext = avcodec_alloc_context3(m_Decoder);
               if (m_CodecContext)
               {
                   m_CodecContext->width = codecpar->width;                    
                   m_CodecContext->height = codecpar->height;
                   m_VideoSize = i3(codecpar->width, codecpar->height,1);
                   success = 0 == avcodec_open2(m_CodecContext, m_Decoder, NULL);
                   if (success)
                   {
                       if(m_CodecContext)
                       {
                           int size = av_image_get_buffer_size(format, m_CodecContext->width, m_CodecContext->height, 1);      
                           if (size > 0)
                           {
                               av_frame = av_frame_alloc();
                               gl_frame = av_frame_alloc();        
                               uint8_t *internal_buffer = (uint8_t *)av_malloc(size * sizeof(uint8_t));
                               av_image_fill_arrays((uint8_t**)((AVPicture *)gl_frame->data), (int*) ((AVPicture *)gl_frame->linesize), internal_buffer, format, m_CodecContext->width, m_CodecContext->height,1);
                               m_Packet = (AVPacket *)av_malloc(sizeof(AVPacket));
                           }
                       }
                   }
                   if (!success)
                   {
                       avcodec_close(m_CodecContext);
                       avcodec_free_context(&m_CodecContext);
                       m_CodecContext = NULL;
                       m_Decoder = NULL;
                       m_VideoStream = NULL;
                   }
               }
               else
               {
                   m_Decoder = NULL;
                   m_VideoStream = NULL;
               }
           }
       }
    }

    And decoding on a single thread:

    do
    {
       if (av_read_frame(m_FormatContext, m_Packet) < 0)
       {
           av_packet_unref(m_Packet);
           m_AllPacketsSent = true;
       }
       else
       {
           if (m_Packet->stream_index == m_StreamIndex)
           {                  
               avcodec_send_packet(m_CodecContext, m_Packet);
           }
       }

       int frame_finished = avcodec_receive_frame(m_CodecContext, av_frame);
       if (frame_finished == 0)
       {
           if (!conv_ctx)
           {
               conv_ctx = sws_getContext(m_CodecContext->width,
                   m_CodecContext->height, m_CodecContext->pix_fmt,
                   m_CodecContext->width, m_CodecContext->height, format, SWS_BICUBIC, NULL, NULL, NULL);
           }

           sws_scale(conv_ctx, av_frame->data, av_frame->linesize, 0, m_CodecContext->height, gl_frame->data, gl_frame->linesize);

           switch(format)
           {
               case AV_PIX_FMT_BGR32_1:
               case AV_PIX_FMT_RGB32_1:
               case AV_PIX_FMT_0BGR32:
               case AV_PIX_FMT_0RGB32:
               case AV_PIX_FMT_BGR32:  
               case AV_PIX_FMT_RGB32:              
               {
                   m_CodecContext->bits_per_raw_sample = 32; break;                    
               }
               default:
               {
                   FWASSERT(format == AV_PIX_FMT_RGB32, "The format changed, update the bits per raw sample!"); break;
               }
           }


           size_t bufferSize = m_CodecContext->width * m_CodecContext->height * m_CodecContext->bits_per_raw_sample / 8;
           m_Buffer.Realloc(bufferSize, false,  gl_frame->data[0]);
           m_VideoSize = i3(m_CodecContext->width, m_CodecContext->height,1);
           result = true;
           // sends the image buffer straight to the locked texture here..
           // glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, codec_ctx->width, codec_ctx->height, GL_RGB, GL_UNSIGNED_BYTE, gl_frame->data[0]);
       }

       av_packet_unref(m_Packet);
    } while (m_Packet->stream_index != m_StreamIndex);

    m_FrameDecoded = result;

    Any insight is appreciated!

  • FFmpeg -movflags produces invalid duration in MP4 [moov] box

    31 août 2016, par LYF

    I am trying to make an MP4 file suitable for HTML5 streaming by muxing an FLV file into a fragmented MP4 with correct metadata in its first moov box.

    I used the command-line parameters from this article on MDN:

    ffmpeg -i h264aac.flv -c copy -movflags frag_keyframe+empty_moov fragmented.mp4

    Then I fed this fragmented.mp4 slowly into an HTML5 SourceBuffer. The video played for one second and stopped (at its second key frame?).

    I looked at the MP4 file with Bento4’s mp4info tool and found an incorrect duration in the moov box:

    Movie:
    duration:   0 ms
    time scale: 1000
    fragments:  yes
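    The number mp4info prints there comes from the duration field of the moov's mvhd box; with empty_moov the muxer writes the moov up front, before it knows the total length, so the stored duration is 0 and a player can only learn the real length from the fragments themselves. A small, self-contained sketch of reading that field from a version-0 mvhd payload (field layout per ISO/IEC 14496-12; this is an illustration, not a general-purpose parser):

```c
#include <stdint.h>

/* Read a 32-bit big-endian integer (MP4 box fields are big-endian). */
static uint32_t be32(const uint8_t *p)
{
    return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16)
         | ((uint32_t)p[2] << 8)  |  (uint32_t)p[3];
}

/* Extract the movie duration in milliseconds from a version-0 'mvhd'
 * box payload (the bytes after the 8-byte box header). Layout per
 * ISO/IEC 14496-12: version(1) flags(3) creation_time(4)
 * modification_time(4) timescale(4) duration(4) ...
 * Returns -1 on anything unexpected. */
long mvhd_duration_ms(const uint8_t *payload, int len)
{
    if (len < 20 || payload[0] != 0) return -1;  /* version-0 boxes only */
    uint32_t timescale = be32(payload + 12);     /* ticks per second */
    uint32_t duration  = be32(payload + 16);     /* length in ticks */
    if (timescale == 0) return -1;
    return (long)(((uint64_t)duration * 1000) / timescale);
}
```

    With timescale 1000, a duration field of 0 produces exactly the "duration: 0 ms" shown above.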

    Then I tried:

    ffmpeg -i h264aac.flv -c copy -movflags frag_keyframe+faststart new_fragmented.mp4

    However, the new_fragmented.mp4 is only 5 seconds long. It should be 90 seconds.

    > mp4info new_fragmented.mp4

    Movie:
    duration:   5182 ms
    time scale: 1000
    fragments:  yes

    I also found on Stack Overflow a working set of movflags used in live streaming:

    -movflags empty_moov+omit_tfhd_offset+frag_keyframe+default_base_moof

    Now the video can be played, but the player does not know the video’s duration until the video is completely downloaded.

    > mp4info stackoverflowSolution.mp4

    duration:   0 ms
    fragments:  yes

    Codecs String: avc1.640028
    Codecs String: mp4a.40.2

    My goal is to make an MP4 file that can be played and has a correct duration while I stream it. I tried switching parameters, adding and removing + signs, but there are too many combinations, and I was not lucky enough to hit a working one by guessing.

    I can generate a fully playable and glitch-less MP4 file using mp4fragment, but I would like to know how to do so in FFmpeg.
