
Media (91)

Other articles (81)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Permissions overridden by plugins

    27 April 2010

    MediaSPIP core
    autoriser_auteur_modifier(), so that visitors can edit their own information on the authors page

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

On other sites (18146)

  • How can I encode a real-time video from dynamically spaced frames?

    7 May 2019, by Ujuf66

    I’m trying to create a video from a series of screenshots. The screenshots are in a database and have a variable frame rate (1-3 FPS). How can I create a video file with a constant FPS?

    Before performing av_packet_rescale_ts I tried to change the st^.codec^.time_base.den value on the fly between 1 and 3.

    This is the basic loop for encoding one picture:

    repeat
      fillchar(pkt, sizeof(TAVPacket), #0);
      av_init_packet(@pkt);

      (* encode the image *)
      ret := avcodec_encode_video2(st^.codec, @pkt, frame, got_packet);
      if (ret < 0) then
      begin
        writeln(format('Error encoding video frame: %s', [av_err2str(ret)]));
        exit;
      end;

      if (got_packet > 0) then
      begin
        (* rescale output packet timestamp values from codec to stream timebase *)
        av_packet_rescale_ts(@pkt, st^.codec^.time_base, st^.time_base);
        pkt.stream_index := st^.index;

        log_packet(oc, @pkt);
        (* Write the compressed frame to the media file. *)
        av_interleaved_write_frame(oc, @pkt);
      end;
      inc(frame.pts);
    until (av_compare_ts(frame.pts, st^.codec^.time_base, 1, av_make_q(1, 1)) >= 0);

    Changing the FPS on the fly causes the video output to fail. If I don’t change the st^.codec^.time_base.den value, the video speeds up and slows down.
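    One common approach (not from the original post; a sketch under assumptions) is to keep the codec time_base constant and derive each frame's pts from its capture timestamp, instead of incrementing pts by one per frame. That way pts gaps reflect real elapsed time and playback speed stays correct even though the source frame rate varies. The helper below mirrors the arithmetic that av_rescale_q would perform for a millisecond source clock, without requiring the FFmpeg headers:

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Hypothetical helper: with a constant codec time_base of
     * tb_num/tb_den, map a capture timestamp in milliseconds to a pts
     * in that time base. Equivalent to
     * av_rescale_q(ms, (AVRational){1, 1000}, time_base). */
    static int64_t pts_from_capture_ms(int64_t capture_ms, int tb_num, int tb_den)
    {
        /* pts * (tb_num / tb_den) seconds == capture_ms / 1000 seconds */
        return capture_ms * tb_den / (1000LL * tb_num);
    }

    int main(void)
    {
        /* Frames captured at irregular intervals (1-3 FPS)... */
        int64_t capture_ms[] = { 0, 400, 1400, 1750, 2800 };

        /* ...encoded with a constant time_base of 1/25. */
        for (int i = 0; i < 5; i++) {
            int64_t pts = pts_from_capture_ms(capture_ms[i], 1, 25);
            (void)pts; /* would be assigned to frame->pts before encoding */
        }
        assert(pts_from_capture_ms(1000, 1, 25) == 25);
        assert(pts_from_capture_ms(400, 1, 25) == 10);
        return 0;
    }
    ```

    With this scheme the encoder's time base never changes mid-stream, which avoids the failure described above; a player that wants strictly constant FPS output can additionally duplicate the previous frame into any empty tick.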

  • libav sws_scale() fails colorspace conversion on real device, works on emulator

    26 August 2020, by chugadie

    I'm making a movie player with libav. I have video packet decoding working, playing in reverse working, and seeking working. All of this works on an x86 Android emulator, but fails on a real Android phone (arm64-v8a).

    The failure is in sws_scale(): it returns 0. The video frames continue to be decoded properly, with no errors.

    There are no errors, warnings, or alerts from libav. I have connected an av_log callback:

    void log_callback(void *ptr, int level, const char *fmt, va_list vargs) {
        /* a va_list must be forwarded through the v-variant of the call */
        if (level <= AV_LOG_WARNING)
            __android_log_vprint(level, LOG_TAG, fmt, vargs);
    }

    uint64_t openMovie(char* path, int rotate, float javaDuration)
    {
        av_log_set_level(AV_LOG_WARNING);
        av_log_set_callback(log_callback);
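    As a side note (not from the original post), a va_list cannot be re-passed to a variadic function such as __android_log_print; it has to be forwarded to the v-variant (__android_log_vprint on Android). The same pattern can be sketched in portable C with vsnprintf, where log_line is a hypothetical logger standing in for the callback:

    ```c
    #include <assert.h>
    #include <stdarg.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical logger: formats into a caller-provided buffer by
     * forwarding the whole va_list to the v-variant (vsnprintf),
     * exactly as a log callback must forward to __android_log_vprint. */
    static void log_line(char *out, size_t cap, const char *fmt, ...)
    {
        va_list args;
        va_start(args, fmt);
        vsnprintf(out, cap, fmt, args); /* forward the va_list itself */
        va_end(args);
    }

    int main(void)
    {
        char buf[64];
        log_line(buf, sizeof buf, "sws_scale returned %d at height %d", 0, 240);
        assert(strcmp(buf, "sws_scale returned 0 at height 240") == 0);
        return 0;
    }
    ```

    Passing the va_list to the plain printf-style variant instead would read garbage off the stack, so warnings could be silently lost or corrupted.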

    The code that performs the sws_scale() is:

    int JVM_getBitmapBuffer( JNIEnv* env, jobject thiz, jlong av, jobject bufferAsInt, jbyte transparent ) { 
    avblock *block = (avblock *) av;
    if (!block) {
        __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, "  avblock is null");
        return AVERROR(EINVAL);
    }
    if (!block->pCodecCtx) {
        __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, "  codecctx is null");
        return AVERROR(EINVAL);
    }

    int width = block->pCodecCtx->width;
    int height = block->pCodecCtx->height;

    if (NULL == block->sws) {
        __android_log_print( ANDROID_LOG_ERROR, LOG_TAG, "getBitmapBuffer:\n  *** invalid sws context ***" );
    }

    int scaleRet = sws_scale( block->sws,
            block->pFrame->data,
            block->pFrame->linesize,
            0,
            height,
            block->pFrameRGB->data,
            block->pFrameRGB->linesize
    );
    if (scaleRet == 0) {
        __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, "  scale failed");
        __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, "  pframe linesize    %d", block->pFrame->linesize[0]);
        __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, "  pframergb linesize %d", block->pFrameRGB->linesize[0]);
        __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, "  height  %d", height);
        return AVERROR(EINVAL);
    }

    Setting up the codec and the AVFrames:

    // I have tried every combination of 1, 8, 16, and 32 for these values
int alignRGB = 32;
int align    = 16; 
int width    = block->pCodecCtx->width;
int height   = block->pCodecCtx->height;
block->pFrame    = av_frame_alloc();
block->pFrameRGB = av_frame_alloc();

block->pFrameRGBBuffer = av_malloc(
    (size_t)av_image_get_buffer_size(AV_PIX_FMT_RGB32, width, height, alignRGB) 
);

av_image_fill_arrays(
    block->pFrameRGB->data,
    block->pFrameRGB->linesize,
    block->pFrameRGBBuffer,
    AV_PIX_FMT_RGB32,
    width,
    height,
    alignRGB
);

block->pFrameBuffer = av_malloc(
        (size_t) av_image_get_buffer_size(block->pCodecCtx->pix_fmt,
                                          width, height, align
        )
);
av_image_fill_arrays(
    block->pFrame->data,
    block->pFrame->linesize,
    block->pFrameBuffer,
    block->pCodecCtx->pix_fmt,
    width, height,
    align
);
block->sws = sws_getContext(
    width, height,
    AV_PIX_FMT_YUV420P,
    width, height,
    AV_PIX_FMT_RGB32,
    SWS_BILINEAR, NULL, NULL, 0
);

    Wildcards are that:

    • I'm using React Native
    • My emulator is x86, Android API 28
    • My real device is arm64-v8a AOSP (around API 28, I don't remember exactly)
    Other notes:

    • The libav .so files are compiled from the mobile-ffmpeg project.
    • sws_scale() also works on x86_64 Linux, using SDL to present YV12.
    • Test video is here: https://github.com/markkimsal/video-thumbnailer/tree/master/fixtures
    • block is a simple C struct with pointers to the relevant AV memory structures.
    • Using FFmpeg 4.3.2


    I'm fairly certain it has something to do with pixel alignment, but documentation on this topic is practically non-existent. It could also be the difference between the RGBA and RGB32 pixel formats, or possibly little-endian vs. big-endian.
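    For reference (not from the original post), FFmpeg's alignment parameter rounds each row stride up to a multiple of the requested alignment, as the FFALIGN macro does, so the linesize written by av_image_fill_arrays() can be wider than width * bytes_per_pixel and may differ between the two frames if their align values differ. A minimal sketch of that arithmetic, assuming 4 bytes per RGB32 pixel and a hypothetical 426-pixel-wide clip:

    ```c
    #include <assert.h>

    /* Round x up to the next multiple of a (a power of two), as
     * FFmpeg's FFALIGN macro does. */
    static int ffalign(int x, int a)
    {
        return (x + a - 1) & ~(a - 1);
    }

    int main(void)
    {
        int width = 426;          /* hypothetical clip width */
        int bytes_per_pixel = 4;  /* RGB32 = 4 bytes per pixel */

        int row = width * bytes_per_pixel;   /* 1704 bytes of pixels */
        int linesize1  = ffalign(row, 1);    /* no padding: 1704 */
        int linesize32 = ffalign(row, 32);   /* padded up to 1728 */

        /* sws_scale() must be given the padded stride, not width*4,
         * or each destination row lands at the wrong offset. */
        assert(linesize1 == 1704);
        assert(linesize32 == 1728);
        return 0;
    }
    ```

    A stride mismatch like this between what the scaler assumes and what the buffer was allocated with is one plausible way behavior could differ across architectures, since the ABI-dependent default alignments differ.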


  • Can ffmpeg periodically report statistics on a real-time audio stream (rather than a file)?

    19 January 2016, by Caius Jard

    I currently use ffmpeg to capture the desktop screen and the audio the computer's speakers are playing, something like a screencast. ffmpeg is started by an app that captures its console output, so I can have that app read the output and look for information.

    I’d like to know whether there is a set of switches I can supply to ffmpeg so that it periodically outputs audio statistics that directly report, or let me infer, that the audio stream has gone silent.

    I see some audio-statistics switches/filters, but the help docs for these seem to imply they collect their stats over the entire stream and then report them at the end. I’d prefer something like "the average audio volume over the past 5 seconds", reported every 5 seconds. I could even deduce it from the encoder's audio bitrate, I think, if it's VBR and the rate consistently falls because it's encoding nothing.
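    One option worth checking against the ffmpeg filter documentation (not mentioned in the post) is the silencedetect audio filter, which logs to stderr as the stream is processed rather than summarizing at the end. The capture inputs below are hypothetical placeholders for whatever devices the screencast app already records from:

    ```shell
    # silencedetect prints "silence_start: T" the moment the level stays
    # below the noise threshold for the given duration, and
    # "silence_end: T | silence_duration: D" when audio resumes.
    #   noise: level below which audio counts as silence
    #   d:     minimum silent duration before reporting
    # The gdigrab/dshow inputs are placeholders for the real capture setup.
    ffmpeg -f gdigrab -i desktop \
           -f dshow -i audio="Stereo Mix" \
           -af silencedetect=noise=-30dB:d=5 \
           -c:v libx264 -c:a aac output.mkv
    ```

    Since the controlling app already reads ffmpeg's console output, it can simply watch for silence_start lines; the astats filter with metadata=1:reset=1 is an alternative that emits per-window statistics, at the cost of much more verbose output.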