
Other articles (73)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MediaSPIP is at version 0.2 or later. If needed, contact the administrator of your MediaSPIP to find out.

  • Installation in standalone mode

    4 February 2011

    Installing the MediaSPIP distribution takes several steps: retrieving the required files. At that point, two methods are possible: installing the ZIP archive containing the whole distribution, or retrieving the sources of each module separately via SVN; then the preconfiguration; then the final installation.
    [mediaspip_zip]Installing the MediaSPIP ZIP archive
    This installation mode is the simplest way to install the whole distribution (...)

  • Installation in farm mode

    4 February 2011

    Farm mode lets you host several MediaSPIP sites while installing its functional core only once.
    This is the method we use on this very platform.
    Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge since the usual SPIP private area is no longer used.
    First of all, you must have installed the same files as for the installation (...)

On other sites (10065)

  • ffmpeg : LIVE HLS Playback getting delayed over time

    31 July 2020, by Krishnakumar

    We are converting a UDP input to a live HLS output and pushing the chunks to a WebDAV server. Initial playback works fine, but when we checked after 15 hours, playback was delayed by almost an hour. We are not sure whether this is caused by the network, since we push the chunks over the internet. Below is the ffmpeg command used.

    


    ffmpeg -i "udp://230.1.1.15:10000?fifo_size=10000000&overrun_nonfatal=1" -filter_complex "[i:0xd49]yadif,setdar=256/144[v0];[i:0xd49]yadif,setdar=512/288[v1];[i:0xd49]yadif,setdar=640/360[v2];[i:0xd49]yadif,setdar=1280/720[v3]" -f hls -map i:0xd4a -af aresample=async=1:min_hard_comp=0.100000:first_pts=0 -c:a:0 libfdk_aac -b:a:0 48000 -ar:0 48000 -map [v0] -b:v:0 100000 -maxrate:v:0 100000 -minrate:v:0 100000 -bufsize:v:0 200000 -s:v:0 256x144 -r:0 25 -sc_threshold 0 -pix_fmt yuv420p -flags +global_header+cgop -c:v:0 libx264 -profile:v:0 baseline -level:v:0 3.0 -preset:v:0 fast -nal-hrd cbr -g 50 -map i:0xd4a -af aresample=async=1:min_hard_comp=0.100000:first_pts=0 -c:a:1 libfdk_aac -b:a:1 48000 -ar:1 48000 -map [v1] -b:v:1 200000 -maxrate:v:1 200000 -minrate:v:1 200000 -bufsize:v:1 400000 -s:v:1 512x288 -r:1 25 -sc_threshold 0 -pix_fmt yuv420p -flags +global_header+cgop -c:v:1 libx264 -profile:v:1 baseline -level:v:1 3.0 -preset:v:1 fast -nal-hrd cbr -g 50 -map i:0xd4a -af aresample=async=1:min_hard_comp=0.100000:first_pts=0 -c:a:2 libfdk_aac -b:a:2 64000 -ar:2 48000 -map [v2] -b:v:2 700000 -maxrate:v:2 700000 -minrate:v:2 700000 -bufsize:v:2 1400000 -s:v:2 640x360 -r:2 25 -sc_threshold 0 -pix_fmt yuv420p -flags +global_header+cgop -c:v:2 libx264 -profile:v:2 baseline -level:v:2 3.0 -preset:v:2 fast -nal-hrd cbr -g 50 -map i:0xd4a -af aresample=async=1:min_hard_comp=0.100000:first_pts=0 -c:a:3 libfdk_aac -b:a:3 64000 -ar:3 48000 -map [v3] -b:v:3 1000000 -maxrate:v:3 1000000 -minrate:v:3 1000000 -bufsize:v:3 2000000 -s:v:3 1280x720 -r:3 25 -sc_threshold 0 -pix_fmt yuv420p -flags +global_header+cgop -c:v:3 libx264 -profile:v:3 high -level:v:3 4.0 -preset:v:3 fast -nal-hrd cbr -g 50 -var_stream_map "a:0,v:0,name:148k a:1,v:1,name:248k a:2,v:2,name:764k a:3,v:3,name:1064k" -master_pl_name master.m3u8 -hls_list_size 3 -hls_time 6 -hls_segment_filename https://usr:pass@example.com:8043/httppush/media_%v_%03d.ts -hls_flags delete_segments+independent_segments+discont_start https://usr:pass@example.com:8043/httppush/playlist_%v.m3u8


    


    FFmpeg v4.3

    


    ffmpeg version 4.3 Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 4.8.5 (GCC) 20150623 (Red Hat 4.8.5-36)
  configuration: --prefix=/root/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/root/ffmpeg_build/include --extra-ldflags=-L/root/ffmpeg_build/lib --extra-libs=-lpthread --extra-libs=-lm --bindir=/root/bin --enable-gpl --enable-libfdk_aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree --enable-openssl
  libavutil      56. 51.100 / 56. 51.100
  libavcodec     58. 91.100 / 58. 91.100
  libavformat    58. 45.100 / 58. 45.100
  libavdevice    58. 10.100 / 58. 10.100
  libavfilter     7. 85.100 /  7. 85.100
  libswscale      5.  7.100 /  5.  7.100
  libswresample   3.  7.100 /  3.  7.100
  libpostproc    55.  7.100 / 55.  7.100


    


    With the split filter:

    


    ffmpeg -i "udp://230.1.1.15:10000?fifo_size=10000000&overrun_nonfatal=1" -filter_complex "[i:0xd49]yadif[v];[v]split=4[1out1][1out2][1out3][1out4];[1out1]setdar=256/144[v0];[1out2]setdar=512/288[v1];[1out3]setdar=640/360[v2];[1out4]setdar=1280/720[v3];[i:0xd4a]aresample=async=1:min_hard_comp=0.100000:first_pts=0[a];[a]asplit=4[a0][a1][a2][a3]" -f hls -map [a0] -c:a:0 libfdk_aac -b:a:0 48000 -ar:0 48000 -map [v0] -b:v:0 100000 -maxrate:v:0 100000 -minrate:v:0 100000 -bufsize:v:0 200000 -s:v:0 256x144 -r:0 25 -sc_threshold 0 -pix_fmt yuv420p -flags +global_header+cgop -c:v:0 libx264 -profile:v:0 baseline -level:v:0 3.0 -preset:v:0 fast -nal-hrd cbr -g 50 -map [a1] -c:a:1 libfdk_aac -b:a:1 48000 -ar:1 48000 -map [v1] -b:v:1 200000 -maxrate:v:1 200000 -minrate:v:1 200000 -bufsize:v:1 400000 -s:v:1 512x288 -r:1 25 -sc_threshold 0 -pix_fmt yuv420p -flags +global_header+cgop -c:v:1 libx264 -profile:v:1 baseline -level:v:1 3.0 -preset:v:1 fast -nal-hrd cbr -g 50 -map [a2] -c:a:2 libfdk_aac -b:a:2 64000 -ar:2 48000 -map [v2] -b:v:2 700000 -maxrate:v:2 700000 -minrate:v:2 700000 -bufsize:v:2 1400000 -s:v:2 640x360 -r:2 25 -sc_threshold 0 -pix_fmt yuv420p -flags +global_header+cgop -c:v:2 libx264 -profile:v:2 baseline -level:v:2 3.0 -preset:v:2 fast -nal-hrd cbr -g 50 -map [a3] -c:a:3 libfdk_aac -b:a:3 64000 -ar:3 48000 -map [v3] -b:v:3 1000000 -maxrate:v:3 1000000 -minrate:v:3 1000000 -bufsize:v:3 2000000 -s:v:3 1280x720 -r:3 25 -sc_threshold 0 -pix_fmt yuv420p -flags +global_header+cgop -c:v:3 libx264 -profile:v:3 high -level:v:3 4.0 -preset:v:3 fast -nal-hrd cbr -g 50 -var_stream_map "a:0,v:0,name:148k a:1,v:1,name:248k a:2,v:2,name:764k a:3,v:3,name:1064k" -master_pl_name master.m3u8 -hls_list_size 3 -hls_time 6 -hls_segment_filename https://usr:pass@example.com:8043/httppush/media_%v_%03d.ts -hls_flags delete_segments+independent_segments+discont_start https://usr:pass@example.com:8043/httppush/playlist_%v.m3u8


    


  • Encoding to h264 failed to send some frames using ffmpeg c api

    8 July 2020, by Vuwox

    Using the FFmpeg C API, I'm trying to push generated images into an MP4 file.

    


    When I push frames one by one, the muxing seems to fail on avcodec_receive_packet(...), which returns AVERROR(EAGAIN) for the first frames; after a while it starts adding my frames, but not the first ones.

    


    What I mean is that when I push frames 1 to 13, I get errors, but from frame 14 to the end (36) the frames are added to the video; however, the encoded images are not frames 14 to 36, instead it is frames 1 to 23 that get added.

    


    I don't understand: is this a problem with the framerate (which I want to be 12 fps), or with key/inter-frames?

    


    Here is the code for the different parts of the class.

    


    NOTE:

    


      

    • m_filename = "C:\tmp\test.mp4"
    • m_framerate = 12
    • m_width = 1080
    • m_height = 1080
    


    ctor

    


// Allocate the temporary frame that holds our generated image in RGB24.
picture_rgb24 = av_frame_alloc();
picture_rgb24->pts = 0;
picture_rgb24->data[0] = NULL;
picture_rgb24->linesize[0] = -1;
picture_rgb24->format = AV_PIX_FMT_RGB24;
picture_rgb24->height = m_height;
picture_rgb24->width = m_width;

if ((_ret = av_image_alloc(picture_rgb24->data, picture_rgb24->linesize, m_width, m_height, (AVPixelFormat)picture_rgb24->format, 24)) < 0)
    throw ...

// Allocate the temporary frame that will hold the RGB image converted to YUV420P via the FFmpeg API.
frame_yuv420 = av_frame_alloc();
frame_yuv420->pts = 0;
frame_yuv420->data[0] = NULL;
frame_yuv420->linesize[0] = -1;
frame_yuv420->format = AV_PIX_FMT_YUV420P;
frame_yuv420->width = m_width;
frame_yuv420->height = m_height;

if ((_ret = av_image_alloc(frame_yuv420->data, frame_yuv420->linesize, m_width, m_height, (AVPixelFormat)frame_yuv420->format, 32)) < 0)
    throw ...

init_muxer(); // see below.

m_inited = true;
    
m_pts_increment = av_rescale_q(1, { 1, m_framerate }, ofmt_ctx->streams[0]->time_base);

// Context that converts RGB24 to YUV420P (used here instead of a filter graph, similar to the GIF case).
swsCtx = sws_getContext(m_width, m_height, AV_PIX_FMT_RGB24, m_width, m_height, AV_PIX_FMT_YUV420P, SWS_BICUBIC, 0, 0, 0);


    


    init_muxer :

    


    AVOutputFormat* oformat = av_guess_format(nullptr, m_filename.c_str(), nullptr);
if (!oformat) throw ...

_ret = avformat_alloc_output_context2(&ofmt_ctx, oformat, nullptr, m_filename.c_str());
if (_ret) throw ...

AVCodec *codec = avcodec_find_encoder(oformat->video_codec);
if (!codec) throw ...

AVStream *stream = avformat_new_stream(ofmt_ctx, codec);
if (!stream) throw ...

o_codec_ctx = avcodec_alloc_context3(codec);
if (!o_codec_ctx) throw ...

stream->codecpar->codec_id = oformat->video_codec;
stream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
stream->codecpar->width = m_width;
stream->codecpar->height = m_height;
stream->codecpar->format = AV_PIX_FMT_YUV420P;
stream->codecpar->bit_rate = 400000;

avcodec_parameters_to_context(o_codec_ctx, stream->codecpar);
o_codec_ctx->time_base = { 1, m_framerate };

// With gop_size == 0 we want intra-only frames, so no B-frames will be generated.
o_codec_ctx->max_b_frames = 0;
o_codec_ctx->gop_size = 0;
o_codec_ctx->b_quant_offset = 0;
//o_codec_ctx->framerate = { m_framerate , 1 };

if (stream->codecpar->codec_id == AV_CODEC_ID_H264)
    av_opt_set(o_codec_ctx, "preset", "ultrafast", 0);      // fastest x264 preset
else if (stream->codecpar->codec_id == AV_CODEC_ID_H265)
    av_opt_set(o_codec_ctx, "preset", "ultrafast", 0);      // fastest x265 preset

avcodec_parameters_from_context(stream->codecpar, o_codec_ctx);

if ((_ret = avcodec_open2(o_codec_ctx, codec, NULL)) < 0)
    throw ...

if ((_ret = avio_open(&ofmt_ctx->pb, m_filename.c_str(), AVIO_FLAG_WRITE)) < 0)
    throw ...

if ((_ret = avformat_write_header(ofmt_ctx, NULL)) < 0)
    throw ...

av_dump_format(ofmt_ctx, 0, m_filename.c_str(), 1);


    


    add_frame :

    


// Loop that copies our image, pixel by pixel, into the FFmpeg RGB24 buffer.
for (int y = 0; y < m_height; y++)
{
    for (int x = 0; x < m_width; x++)
    {
        picture_rgb24->data[0][idx] = ...;
        picture_rgb24->data[0][idx + 1] = ...;
        picture_rgb24->data[0][idx + 2] = ...;
    }
}

// From RGB to YUV
sws_scale(swsCtx, (const uint8_t * const *)picture_rgb24->data, picture_rgb24->linesize, 0, m_height, frame_yuv420->data, frame_yuv420->linesize);

// mux the YUV frame
muxing_one_frame(frame_yuv420);

// Advance the PTS for the next frame pushed to the encoder.
picture_rgb24->pts += m_pts_increment;
frame_yuv420->pts += m_pts_increment;


    


    muxing_one_frame :

    


    int ret = avcodec_send_frame(o_codec_ctx, frame);
AVPacket *pkt = av_packet_alloc();
av_init_packet(pkt);

while (ret >= 0) {
    ret = avcodec_receive_packet(o_codec_ctx, pkt);
    if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) break;        
    av_write_frame(ofmt_ctx, pkt);
}
av_packet_unref(pkt);
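
    The AVERROR(EAGAIN) on the first frames is expected behaviour: libx264 buffers several frames (frame threads and lookahead) before it emits its first packet, so the packets that do come out always lag behind the frames being sent. Two other details commonly cause the "wrong frames end up in the file" symptom: packet timestamps are written without rescaling from the codec time base to the stream time base, and av_write_frame is used instead of the interleaving variant. The sketch below is not the poster's code; it is a minimal, hedged rewrite of the receive/write loop, assuming the same o_codec_ctx, ofmt_ctx and single video stream (index 0) as above, that frames are stamped in the encoder time base (pts += 1 per frame at 1/m_framerate, unlike the stream-time-base increments in the ctor), and a hypothetical helper name write_available_packets.

// Minimal sketch (assumptions: o_codec_ctx, ofmt_ctx and one video stream at
// index 0, as in the question; frame->pts counted in o_codec_ctx->time_base).
// Drains every packet the encoder currently has ready and writes it to the
// muxer with rescaled timestamps.
static int write_available_packets(AVCodecContext *o_codec_ctx,
                                   AVFormatContext *ofmt_ctx)
{
    AVPacket *pkt = av_packet_alloc();
    if (!pkt)
        return AVERROR(ENOMEM);

    int ret;
    while ((ret = avcodec_receive_packet(o_codec_ctx, pkt)) >= 0) {
        // Convert pts/dts/duration from the encoder time base (1/framerate)
        // to the stream time base chosen by the MP4 muxer.
        av_packet_rescale_ts(pkt, o_codec_ctx->time_base,
                             ofmt_ctx->streams[0]->time_base);
        pkt->stream_index = 0;

        ret = av_interleaved_write_frame(ofmt_ctx, pkt);  // handles packet ordering
        av_packet_unref(pkt);
        if (ret < 0)
            break;
    }

    av_packet_free(&pkt);
    // AVERROR(EAGAIN) simply means "send more frames"; AVERROR_EOF means the
    // encoder has been fully flushed. Neither is an error for the caller.
    return (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) ? 0 : ret;
}

    Called after each avcodec_send_frame(o_codec_ctx, frame_yuv420), this keeps the muxer fed while the encoder is still buffering its first frames, and the packets written stay aligned with the PTS values set in add_frame.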


    


    close_file :

    


    av_write_trailer(ofmt_ctx);
avio_close(ofmt_ctx->pb);
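
    Because of that encoder buffering, the frames sent last never reach the file unless the encoder is drained before the trailer is written, which matches the observation that only frames 1 to 23 of 36 end up in the output. Below is a hedged sketch of such a flush at close time, written as a free function for illustration (the real code is a class method) and reusing the hypothetical write_available_packets() helper sketched above.

// Minimal sketch (assumptions: the same o_codec_ctx / ofmt_ctx as above and
// the hypothetical write_available_packets() helper). Flushes the encoder so
// the frames it is still holding are written before the file is finalized.
static void close_output_file(AVCodecContext *o_codec_ctx, AVFormatContext *ofmt_ctx)
{
    // Enter draining mode: a NULL frame tells the encoder no more input is coming.
    avcodec_send_frame(o_codec_ctx, NULL);

    // Receive and write every remaining packet; the helper stops at AVERROR_EOF.
    write_available_packets(o_codec_ctx, ofmt_ctx);

    // Only now finalize the container and close the output.
    av_write_trailer(ofmt_ctx);
    avio_closep(&ofmt_ctx->pb);
}

    Without this drain, the trailer is written while roughly a dozen frames are still inside libx264, which is about the number of frames missing from the end of the file.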


    


  • Transcode HLS segments individually with FFMPEG

    9 July 2020, by Mathix420

    I'm trying to transcode a video to HLS by first splitting it into segments without re-encoding, and then transcoding all the segments individually. I'm trying to achieve this so that I can transcode a video on multiple EC2 instances in parallel and be more time-efficient.

    


    I am using this script right now:

    


    # Split input file in multiple segments

ffmpeg -hide_banner -y -i $input -c copy -map 0 -an -segment_time 4 -reset_timestamps 1 -f segment output%03d.webm

# Transcode each segments in multiple resolutions

find . -name 'output*.webm' -exec ffmpeg -hide_banner -y -i {} \
  -vf "scale=-2:360" -c:v libx264 -profile:v main -crf 20 -sc_threshold 0 -b:v 800k -maxrate 856k -bufsize 1200k {}.360p.ts \
  -vf "scale=-2:480" -c:v libx264 -profile:v main -crf 20 -sc_threshold 0 -b:v 1400k -maxrate 1498k -bufsize 2100k {}.480p.ts \
  -vf "scale=-2:720" -c:v libx264 -profile:v main -crf 20 -sc_threshold 0 -b:v 2800k -maxrate 2996k -bufsize 4200k {}.720p.ts \
  -vf "scale=-2:1080" -c:v libx264 -profile:v main -crf 20 -sc_threshold 0 -b:v 5000k -maxrate 5350k -bufsize 7500k {}.1080p.ts \;


    


    But then, when I tried to get all the segment durations to build an m3u8 playlist (with the command below),

    


    # List segments duration

find . -name 'output*.webm.360p.ts' \
  -exec echo -n {} \; \
  -exec ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 {} \;


    


    I got this result

    


    output000.webm.360p.ts 5.120000
output001.webm.360p.ts 5.120000
output002.webm.360p.ts 4.400000
output003.webm.360p.ts 5.480000
output004.webm.360p.ts 0.360000
output005.webm.360p.ts 5.120000
output006.webm.360p.ts 4.960000
output007.webm.360p.ts 0.001000


    


    I can't figure out why my output004 is only 0.360000 seconds long.

    


    When I tried to play it with VLC, it just shows one or two frames and then main decoder error: buffer deadlock prevented.

    


    Thanks for trying to help me!