
Media (91)

Other articles (107)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page

  • Customizing by adding your logo, banner or background image

    5 September 2013, by

    Some themes take into account three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present changes to your MédiaSPIP, or news about your projects, using the news section of your MédiaSPIP.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the news creation form.
    News creation form: for a document of type news, the fields offered by default are: Publication date (customize the publication date) (...)

On other sites (13435)

  • avformat/hls: Factor playlist need check to a common function

    19 November 2017, by Anssi Hannula
    avformat/hls: Factor playlist need check to a common function
    
    • [DH] libavformat/hls.c
  • FFmpeg Hardware Acceleration with NVENC produces Half Green output video

    17 June 2016, by Dan Sandland

    Using the FFmpeg build found here: https://github.com/illuspas/ffmpeg-hw-win32

    gcc 5.3.0
    --enable-nvenc nvidia_video_sdk_6.0.1
    --enable-libmfx Intel(R)_Media_SDK_2016.0.1
    --enable-libfdk-aac 0.1.4
    --enable-libspeex 1.2rc1
    --enable-libx264 1:148.20150725
    --enable-libopenh264 1.5.0
    --enable-libx265 1.8
    --enable-libopus 1.1.2
    --enable-libmp3lame 3.99.5
    --enable-libkvazaar 0.8.2

    ./configure --prefix=/home/aliang/FFmpeg/x86_64 --enable-small --disable-debug --disable-doc --arch=x86_64 --cc='ccache x86_64-w64-mingw32-gcc' --cross-prefix=x86_64-w64-mingw32- --enable-cross-compile --target-os=mingw32 --enable-libfdk-aac --enable-libmp3lame --enable-libopus --enable-libspeex --enable-libx264 --enable-libx265 --enable-libmfx --enable-nvenc --enable-libopenh264 --enable-libkvazaar --enable-gpl --enable-nonfree

    I’m running Windows on a MacBook Pro. I also tried with a more recent build and had the same output.

    Input video is from sample-videos.com.

    The ffmpeg command I am running is:

    ffmpeg -y -i sample.mp4 -vcodec nvenc_h264 -pixel_format yuv420p -f mp4 sample-out-nvenc.mp4

    sample-out-nvenc.mp4 looks like this via ffplay or VLC:

    [screenshot: half-green output video]

    When I grab a frame using image2, the colors appear normal, but the height is squished.

    ffmpeg -y -ss 15.5 -i sample.mp4 -vframes 1 -s 480x300 -f image2 grab.jpg

    [screenshot: frame grab, colors normal but vertically squished]

    The ffprobe results for the output (sample-out-nvenc.mp4):

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'sample-out-nvenc.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.25.100
     Duration: 00:00:31.02, start: 0.021333, bitrate: 1994 kb/s
       Stream #0:0(und): Video: h264 (avc1 / 0x31637661), yuv420p(tv), 640x480 [SAR 1:1 DAR 4:3], 1650 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 48000 Hz, 5.1, fltp, 342 kb/s (default)
       Metadata:
         handler_name    : SoundHandler

    Lastly, the output from the nvenc encoding command:

    ffmpeg -y -i sample.mp4 -vcodec nvenc_h264 -pixel_format yuv420p -f mp4 sample-out-nvenc.mp4
    ffmpeg version 3.0 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 5.3.0 (GCC)
     configuration: --prefix=/home/aliang/FFmpeg/x86_64 --enable-small --disable-debug --disable-doc --arch=x86_64 --cc='ccache x86_64-w64-mingw32-gcc' --cross-prefix=x86_64-w64-mingw32- --enable-cross-compile --target-os=mingw32 --enable-libfdk-aac --enable-libmp3lame --enable-libopus --enable-libspeex --enable-libx264 --enable-libx265 --enable-libmfx --enable-nvenc --enable-libopenh264 --enable-libkvazaar --enable-gpl --enable-nonfree
     libavutil      55. 17.103 / 55. 17.103
     libavcodec     57. 24.102 / 57. 24.102
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 31.100 /  6. 31.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'sample.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       creation_time   : 1970-01-01 00:00:00
       encoder         : Lavf53.24.2
     Duration: 00:00:31.00, start: 0.000000, bitrate: 1353 kb/s
       Stream #0:0(und): Video: h264 (avc1 / 0x31637661), yuv420p, 640x480 [SAR 1:1 DAR 4:3], 966 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
       Metadata:
         creation_time   : 1970-01-01 00:00:00
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 48000 Hz, 5.1, fltp, 383 kb/s (default)
       Metadata:
         creation_time   : 1970-01-01 00:00:00
         handler_name    : SoundHandler
    Output #0, mp4, to 'sample-out-nvenc.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.25.100
       Stream #0:0(und): Video: h264 (nvenc_h264) ([33][0][0][0] / 0x0021), yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=-1--1, 2000 kb/s, 25 fps, 12800 tbn, 25 tbc (default)
       Metadata:
         creation_time   : 1970-01-01 00:00:00
         handler_name    : VideoHandler
         encoder         : Lavc57.24.102 nvenc_h264
       Side data:
         unknown side data type 10 (24 bytes)
       Stream #0:1(und): Audio: aac ([64][0][0][0] / 0x0040), 48000 Hz, 5.1, fltp, 341 kb/s (default)
       Metadata:
         creation_time   : 1970-01-01 00:00:00
         handler_name    : SoundHandler
         encoder         : Lavc57.24.102 aac
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> h264 (nvenc_h264))
     Stream #0:1 -> #0:1 (aac (native) -> aac (native))
    Press [q] to stop, [?] for help
    frame=  774 fps=253 q=-0.0 Lsize=    7551kB time=00:00:30.99 bitrate=1995.6kbits/s speed=10.1x
    video:6236kB audio:1297kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.243011%
    [aac @ 000001cdd9900520] Qavg: 743.457
  • Rotating a video during encoding with ffmpeg and libav API results in half of video corrupted

    11 May 2020, by Daniel Kobe

    I'm using the C API for ffmpeg/libav to rotate a vertically filmed iPhone video during the encoding step. There are other questions asking how to do a similar thing, but they all use the CLI tool to do so.

    So far I was able to figure out how to use the AVFilter API to rotate the video, based on this example: https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/filtering_video.c

    The problem is that half the output file is corrupt.
    [screenshot: corrupted output video]

    Here is the code for my encoding logic. It's written in Go, using cgo to interface with the C API.

// Encode encodes an AVFrame and returns it
func Encode(enc Encoder, frame *C.AVFrame) (*EncodedFrame, error) {
    ctx := enc.Context()

    if ctx.buffersrcctx == nil {
        // initialize filter
        outputs := C.avfilter_inout_alloc()
        inputs  := C.avfilter_inout_alloc()
        m_pFilterGraph := C.avfilter_graph_alloc()
        buffersrc := C.avfilter_get_by_name(C.CString("buffer"))
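        // The buffer source has to be told the geometry, pixel format, time base and
        // sample aspect ratio of the frames it will be fed, so those are read from the
        // codec context (ctx.avctx) below.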
        argsStr := fmt.Sprintf("video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d", ctx.avctx.width, ctx.avctx.height, ctx.avctx.pix_fmt, ctx.avctx.time_base.num, ctx.avctx.time_base.den, ctx.avctx.sample_aspect_ratio.num, ctx.avctx.sample_aspect_ratio.den)
        Log.Info.Println("yakotest")
        Log.Info.Println(argsStr)
        args := C.CString(argsStr)
        ret := C.avfilter_graph_create_filter(&ctx.buffersrcctx, buffersrc, C.CString("my_buffersrc"), args, nil, m_pFilterGraph)
        if ret < 0 {
            Log.Info.Printf("\n problem creating filter %v\n", AVError(ret).Error())
        }

        buffersink := C.avfilter_get_by_name(C.CString("buffersink"))
        ret = C.avfilter_graph_create_filter(&ctx.buffersinkctx, buffersink, C.CString("my_buffersink"), nil, nil, m_pFilterGraph)
        if ret < 0 {
            Log.Info.Printf("\n problem creating filter %v\n", AVError(ret).Error())
        }

        /*
         * Set the endpoints for the filter graph. The filter_graph will
         * be linked to the graph described by filters_descr.
         */

        /*
         * The buffer source output must be connected to the input pad of
         * the first filter described by filters_descr; since the first
         * filter input label is not specified, it is set to "in" by
         * default.
         */
        outputs.name       = C.av_strdup(C.CString("in"))
        outputs.filter_ctx = ctx.buffersrcctx
        outputs.pad_idx    = 0
        outputs.next       = nil

        /*
         * The buffer sink input must be connected to the output pad of
         * the last filter described by filters_descr; since the last
         * filter output label is not specified, it is set to "out" by
         * default.
         */
        inputs.name       = C.av_strdup(C.CString("out"))
        inputs.filter_ctx = ctx.buffersinkctx
        inputs.pad_idx    = 0
        inputs.next       = nil

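        // The parsed chain rotates the frame 90° clockwise (transpose=clock) and then
        // scales it to a height of 1080, with the width chosen automatically and
        // rounded to an even value (scale=-2:1080).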
        ret = C.avfilter_graph_parse_ptr(m_pFilterGraph, C.CString("transpose=clock,scale=-2:1080"),
            &inputs, &outputs, nil)
        if ret < 0 {
            Log.Info.Printf("\n problem with avfilter_graph_parse %v\n", AVError(ret).Error())
        }

        ret = C.avfilter_graph_config(m_pFilterGraph, nil)
        if ret < 0 {
            Log.Info.Printf("\n problem with graph config %v\n", AVError(ret).Error())
        }
    }

    filteredFrame :=  C.av_frame_alloc()

    /* push the decoded frame into the filtergraph */
    ret := C.av_buffersrc_add_frame_flags(ctx.buffersrcctx, frame, C.AV_BUFFERSRC_FLAG_KEEP_REF)
    if ret < 0 {
        Log.Error.Printf("\nError while feeding the filter greaph, err = %v\n", AVError(ret).Error())
        return nil, errors.New(ErrorFFmpegCodecFailure)
    }

    /* pull filtered frames from the filtergraph */
    for {
        ret = C.av_buffersink_get_frame(ctx.buffersinkctx, filteredFrame)
        if ret == C.AVERROR_EAGAIN || ret == C.AVERROR_EOF {
            break
        }
        if ret < 0 {
            Log.Error.Printf("\nCouldnt find a frame, err = %v\n", AVError(ret).Error())
            return nil, errors.New(ErrorFFmpegCodecFailure)
        }

        filteredFrame.pts = frame.pts
        frame = filteredFrame
        defer C.av_frame_free(&filteredFrame)
    }

    if frame != nil {
        frame.pict_type = 0 // reset pict type for the encoder
        if C.avcodec_send_frame(ctx.avctx, frame) != 0 {
            Log.Error.Printf("%+v\n", StackErrorf("codec error, could not send frame"))
            return nil, errors.New(ErrorFFmpegCodecFailure)
        }
    }

    for {
        ret := C.avcodec_receive_packet(ctx.avctx, ctx.pkt)
        if ret == C.AVERROR_EAGAIN {
            break
        }
        if ret == C.AVERROR_EOF {
            return nil, fmt.Errorf("EOF")
        }
        if ret < 0 {
            Log.Error.Printf("%+v\n", StackErrorf("codec error, receiving packet"))
            return nil, errors.New(ErrorFFmpegCodecFailure)
        }

        data := C.GoBytes(unsafe.Pointer(ctx.pkt.data), ctx.pkt.size)
        return &EncodedFrame{data, int64(ctx.pkt.pts), int64(ctx.pkt.dts),
            (ctx.pkt.flags & C.AV_PKT_FLAG_KEY) != 0}, nil
    }

    return nil, nil
}

    It seems like I need to do something with the scaling here, but I'm struggling to find helpful information online.
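
    One thing worth checking, offered only as a guess from the symptom rather than a confirmed fix: transpose=clock,scale=-2:1080 changes the frame geometry, so if the encoder context (ctx.avctx) was opened with the original input width and height, it ends up being fed frames of a different size, which could explain this kind of corruption. Below is a minimal cgo sketch of copying the geometry negotiated by the buffersink back into the encoder context; syncEncoderToSink is a hypothetical helper name, and av_buffersink_get_w/h/format are the buffersink accessors available in recent libavfilter.

    package encoder // hypothetical package name, adjust to your project

    /*
    #cgo pkg-config: libavcodec libavfilter
    #include <libavcodec/avcodec.h>
    #include <libavfilter/buffersink.h>
    */
    import "C"

    // syncEncoderToSink copies the width, height and pixel format negotiated by the
    // filter graph's buffersink into the encoder context. transpose swaps the
    // dimensions and scale=-2:1080 changes them again, so an encoder opened with the
    // original input geometry would otherwise receive frames of a different size.
    func syncEncoderToSink(enc *C.AVCodecContext, sink *C.AVFilterContext) {
        enc.width = C.av_buffersink_get_w(sink)
        enc.height = C.av_buffersink_get_h(sink)
        enc.pix_fmt = C.enum_AVPixelFormat(C.av_buffersink_get_format(sink))
    }

    Calling something like this between avfilter_graph_config and avcodec_open2 (or before re-opening the encoder for the filtered stream) keeps the encoder's idea of the frame size in line with what the graph actually emits; whether that fits the setup order of the rest of your pipeline is an assumption on my part.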