
Other articles (44)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Websites built with MediaSPIP

    2 May 2011

    This page presents some of the websites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things):
    - implementation costs to be shared between several different projects / individuals
    - rapid deployment of multiple unique sites
    - creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (5288)

  • How can I have ffmpeg receive both video and audio over RTP ?

    23 May 2018, by KallDrexx

    I am trying to instruct FFMPEG to receive h264 video and aac audio via RTP using out of band session initialization.

    To do that I have the following local SDP:

    v=0
    o=sb
    s=-
    t=0 0
    c=IN IP4 127.0.0.1
    m=video 12100 RTP/AVP 96
    a=rtpmap:96 H264/90000
    m=audio 12101 RTP/AVP 97
    a=rtpmap:97 MPEG4-GENERIC/44100/2

    When I load ffmpeg with:

    ffmpeg -loglevel debug -protocol_whitelist "file,rtp,udp" -i .\test.sdp -strict -2 test.flv

    I get the following error:

    [udp @ 0000022d7fdafe80] bind failed: Error number -10048 occurred
    [AVIOContext @ 0000022d7fd84900] Statistics: 154 bytes read, 0 seeks
    .\test.sdp: Invalid data found when processing input

    Confused by that error code, I loaded it up on a Linux VM, and the bind error I got there was "Address already in use".

    I tried changing both of those port numbers all around and kept getting that error. Finally I removed one of the media streams from the SDP so it ONLY had video or ONLY had audio and no binding error occurred.

    How can I get ffmpeg to bind to multiple RTP ports for RTP ingestion?
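
    A hedged note, not part of the original question: ffmpeg's RTP demuxer conventionally binds an RTCP socket on the RTP port + 1 for each media line, so the video line on 12100 also claims 12101, and the audio line then fails to bind that same port. Under that assumption, keeping the RTP ports two apart (even ports for RTP, odd for RTCP) should avoid the conflict, e.g.:

     v=0
     o=sb
     s=-
     t=0 0
     c=IN IP4 127.0.0.1
     m=video 12100 RTP/AVP 96
     a=rtpmap:96 H264/90000
     m=audio 12102 RTP/AVP 97
     a=rtpmap:97 MPEG4-GENERIC/44100/2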

  • ffmpeg : remove dead call to av_parser_change()

    21 March 2018, by James Almer
    ffmpeg : remove dead call to av_parser_change()
    

    It's been a noop for years, and it's been argued that in-band headers
    should not be forcibly removed without the user's explicit request.

    Also, as the FIXME line stated, this is a job for a bitstream filter
    like extract_extradata, remove_extradata, dump_extradata, and
    filter_units.

    Signed-off-by: James Almer <jamrial@gmail.com>

    • [DH] fftools/ffmpeg.c
    • [DH] fftools/ffmpeg.h
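
    As an illustration of the bitstream-filter route the message points to (the command below is a sketch, not part of the commit), in-band parameter sets are only added or stripped when the user explicitly asks for it, e.g. with dump_extradata when remuxing from a global-header container such as MP4 to MPEG-TS, which expects in-band headers:

     ffmpeg -i input.mp4 -c:v copy -bsf:v dump_extradata output.ts
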
  • Using FFMPEG to make HLS clips from H264

    21 November 2017, by Tyler Brooks

    I am using a Hi35xx camera processor from HiSilicon. It is an Arm9 with a video pipeline bolted on the side. At one end of the pipeline is the CMOS sensor. At the other end is an H264 encoder. When I turn on the pipeline, the encoder outputs H264 NAL packets like this:

     frame0:   <sps>, <pps>, <sei>, <key frame>
     frame1:   <delta frame>
     frame2:   <delta frame>
     ...
     frameN:   <delta frame>
     frameN+1: <sps>, <pps>, <sei>, <key frame>
     frameN+2: <delta frame>
     frameN+3: <delta frame>
     ...
     etc.

    I am turning that into HLS clips by doing the following (pseudo code for clarity):

    av_register_all();
    avformat_network_init();

     avformat_alloc_output_context2(&ctx_out, NULL, "hls", "./foo.m3u8");

    strm_out = avformat_new_stream(ctx_out, NULL);

    codec_out = strm_out->codecpar;
    codec_out->codec_id = AV_CODEC_ID_H264;
    codec_out->codec_type = AVMEDIA_TYPE_VIDEO;
    codec_out->width = encoder_width;
    codec_out->height = encoder_height;
    codec_out->bit_rate = encoder_bitrate;
    codec_out->codec_tag = 0;

    avformat_write_header(ctx_out, NULL);

     while(get_packet_from_pipeline_encoder(&encoder_packet)) {
      AVPacket pkt;
      av_init_packet(&pkt);
     pkt.stream_index = 0;

     pkt.dts = AV_NOPTS_VALUE;
     pkt.pts = AV_NOPTS_VALUE;
     pkt.duration = (1000000/FRAMERATE);    // frame rate in microseconds

     pkt.data = encoder_packet.data;
     pkt.size = encoder_packet.size;

      if (is_keyframe(&encoder_packet)) {
       pkt.flags |= AV_PKT_FLAG_KEY;
     }

      av_write_frame(ctx_out, &pkt);
    }

    av_write_trailer(ctx_out);
    avformat_free_context(ctx_out);

    This seems to work fine except that the resulting HLS frame rate is not right. Of course, this happens because I am not setting the pts/dts stuff correctly, and ffmpeg lets me know that. So I have two questions:

    1. Am I going about this right?
    2. How can I set the pts/dts stuff correctly?

    The encoder is giving me packets and I am submitting them as frames. Those <sps>, <pps> and <sei> packets are really out-of-band data and don’t really have a timestamp. How can I submit them correctly?
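
    A minimal sketch of one way to approach question 2, offered as an assumption rather than a confirmed answer, and reusing the question's own names (FRAMERATE, ctx_out, strm_out, encoder_packet, get_packet_from_pipeline_encoder(), is_keyframe()): give the stream a 1/FRAMERATE time base, count frames, and rescale the timestamps into whatever time base the muxer actually settles on after avformat_write_header(). Since the SPS/PPS/SEI NALs carry no timestamp of their own, a common convention is to keep them glued to the keyframe packet they precede (or to move them into codecpar->extradata) instead of writing them as separate packets.

     /* Sketch only: continues the question's pseudo code, assumes no B-frames
        (so pts == dts) and one tick per frame in a 1/FRAMERATE time base. */
     AVRational input_tb = (AVRational){1, FRAMERATE};
     strm_out->time_base = input_tb;            /* a hint; the muxer may override it */

     avformat_write_header(ctx_out, NULL);      /* hls/mpegts typically switch to 1/90000 */

     int64_t frame_index = 0;
     while (get_packet_from_pipeline_encoder(&encoder_packet)) {
      AVPacket pkt;
      av_init_packet(&pkt);
      pkt.stream_index = 0;
      pkt.data = encoder_packet.data;           /* keyframe packets keep their SPS/PPS/SEI prefix */
      pkt.size = encoder_packet.size;

      pkt.pts = pkt.dts = frame_index++;        /* timestamps counted in input_tb */
      pkt.duration = 1;                         /* exactly one frame */
      av_packet_rescale_ts(&pkt, input_tb, strm_out->time_base);

      if (is_keyframe(&encoder_packet)) {
        pkt.flags |= AV_PKT_FLAG_KEY;
      }

      av_write_frame(ctx_out, &pkt);
     }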