
Other articles (85)

  • The accepted formats

    28 January 2010, by

    The following commands give information about the formats and codecs handled by the local ffmpeg installation (a short filtering example follows this excerpt):
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)
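
    A small example of how the output of those two commands can be filtered to check for one specific codec or container (h264 and flv here are only examples):

    # check whether a specific codec (here h264) is known to the local build
    ffmpeg -codecs 2>/dev/null | grep -i h264
    # same idea for container formats, e.g. flv
    ffmpeg -formats 2>/dev/null | grep -i flv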

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Retrieving information from the master site when installing an instance

    26 November 2010, by

    Purpose
    On the main site, a shared-hosting ("mutualisation") instance is defined by several things: the data in the spip_mutus table; its logo; its main author (the id_admin in the spip_mutus table, corresponding to an id_auteur in the spip_auteurs table), who will be the only one able to definitively create the shared-hosting instance.
    It can therefore be quite sensible to want to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...)

On other sites (9934)

  • ffmpeg rtmp and local file output

    19 May 2016, by user3922282

    I'm new to Stack Overflow =)
    I have a problem with ffmpeg.
    I receive an RTSP stream from a grabbing device (a camera) and stream it out to RTMP (YouTube Live).
    I want to keep a copy of the stream on my computer, so I also write to a local file at the same time.

    I use this command:

    ffmpeg -y -i 'RTSP_SOURCE' -c:v copy -c:a libvo_aacenc -map 0:v -bsf:v
    dump_extra -fflags +genpts -flags +global_header -movflags +faststart
    -map_metadata 0 -metadata title= -f tee -filter_complex aevalsrc=0 '[f=mp4]/tmp/backup.mp4|[f=mpegts]/tmp/backup.ts|[f=flv]rtmp://a.rtmp.youtube.com/live2/STREAM_ID'

    The problem is that when I have some disconnections, ffmpeg exits and stops recording.
    Is there any flag or option to tell ffmpeg to keep recording to the local files even when there is no internet?

    Thank you very much for your help =)
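
    A minimal sketch of one possible approach, assuming the ffmpeg build is recent enough to support the tee muxer's per-output onfail option (RTSP_SOURCE and STREAM_ID are the placeholders from the question; the audio options of the original command are left out to keep the sketch short):

    # onfail=ignore lets the remaining outputs keep running if the RTMP output fails
    ffmpeg -y -i 'RTSP_SOURCE' -c:v copy -map 0:v \
      -flags +global_header -f tee \
      '[f=mp4]/tmp/backup.mp4|[f=mpegts]/tmp/backup.ts|[f=flv:onfail=ignore]rtmp://a.rtmp.youtube.com/live2/STREAM_ID'

    With onfail=ignore only the failing output is dropped, so /tmp/backup.mp4 and /tmp/backup.ts keep being written locally; automatically reconnecting to YouTube afterwards is a separate problem.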

  • Is there a way of making ffmpeg start with the largest folders first? [closed]

    5 December 2024, by tuqueque

    I have the following script (snippet), where I search for all the folders and then pass the result to xargs, where I run ffmpeg in parallel for as many threads as my CPU has (32) to convert the image sequences inside each folder:

    find . -type d -print0 | xargs -0 -P32 -I{} ffmpeg...

    The thing is that there are folders with just 10 images inside and other folders with 150 images... I'd love it if ffmpeg started with the largest folders first and then continued in descending order... not one at a time, but keeping the parallel nature of the script.

    Is that even possible without radically altering the one-line script I have?
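
    One possible way, sketched below assuming bash plus GNU find/sort/cut/xargs and filenames without embedded newlines; the per-folder file count stands in for "size", and the trailing ffmpeg... is the same elided command as above:

    # count the files in each folder, sort folders by that count in descending
    # order, then hand them to xargs exactly as before (still 32 in parallel)
    find . -type d -print0 |
      while IFS= read -r -d '' dir; do
        printf '%s\t%s\0' "$(find "$dir" -maxdepth 1 -type f | wc -l)" "$dir"
      done |
      sort -z -rn |
      cut -z -f2- |
      xargs -0 -P32 -I{} ffmpeg...

    Because xargs dispatches jobs in input order, the biggest folders are started first while the 32-way parallelism stays unchanged.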

    


  • ffmpeg got black and white video when encoding flv

    17 December 2012, by samyou

    I searched the site and found a post:
    getting black and white image after encoding
    but it got no answer.

    I don't know why, but it is all black and white.

    Here is the init code:

    JNIEXPORT jboolean JNICALL Java_sam_flvmuxer_SamRTMPNative_nativeInitMuxerAndStart(
       JNIEnv *env, jclass jcls, jstring outfile, jint inwidth, jint inheight,
       jint fps) {
    audioOutBuffer = malloc(AUDIO_OUT_BUFFER_SIZE);
    videoOutBuffer = malloc(VIDEO_OUT_BUFFER_SIZE);
    VIDEO_WIDTH = inwidth;
    VIDEO_HEIGHT = inheight;
    av_log_set_callback(samffmpeglogback);
    av_register_all();
    char *filepath = (*env)->GetStringUTFChars(env, outfile, 0);
    JNILOG("file path is %s",filepath);
    avformat_alloc_output_context2(&avFormatContext, NULL, NULL, filepath);
    if (!avFormatContext) {
       JNILOG("avformat_alloc_output_context2 with filepath failed");
       return JNI_FALSE;
    }
    AVOutputFormat *fmt = avFormatContext->oformat;
    //fmt->video_codec = VIDEO_CODEC_ID;
    ////init video
    avVideoStream = avformat_new_stream(avFormatContext, NULL );
    if (!avVideoStream) {
       (*env)->ReleaseStringUTFChars(env, outfile, filepath);
       return 0;
    }

    AVCodec *videocodec = avcodec_find_encoder(VIDEO_CODEC_ID);
    if (!videocodec) {
       JNILOG("avcodec_find_encoder error");
       return JNI_FALSE;
    }
    avcodec_get_context_defaults3(avVideoStream->codec, videocodec);

    AVCodecContext *avVideoCodecContext = avVideoStream->codec;
    avVideoCodecContext->codec_type = AVMEDIA_TYPE_VIDEO;
    avVideoCodecContext->codec_id = VIDEO_CODEC_ID;
    avVideoCodecContext->width = inwidth;
    avVideoCodecContext->height = inheight;
    avVideoCodecContext->time_base.den = fps;
    avVideoCodecContext->time_base.num = 1;
    avVideoCodecContext->gop_size = 10;
    avVideoCodecContext->pix_fmt = PIX_FMT_YUV420P;
    JNILOG("bitrate before set = %d",avVideoCodecContext->bit_rate);
    avVideoCodecContext->bit_rate = 600000;


    if (fmt->flags & AVFMT_GLOBALHEADER)
       avVideoCodecContext->flags |= CODEC_FLAG_GLOBAL_HEADER;

    av_dump_format(avFormatContext,0,filepath,1);
    if(avcodec_open2(avVideoCodecContext,videocodec,NULL)<0)
    {
       JNILOG("video avcodec_open2 failed");
       (*env)->ReleaseStringUTFChars(env, outfile, filepath);
       return JNI_FALSE;
    }



    ///////
    /* open the output file, if needed */
    if (!(fmt->flags & AVFMT_NOFILE)) {
       if ((avio_open(&avFormatContext->pb, filepath, AVIO_FLAG_WRITE)) < 0) {
           JNILOG("Could not open file!");
           (*env)->ReleaseStringUTFChars(env, outfile, filepath);
           return 0;
       }
    }
    if (avformat_write_header(avFormatContext, NULL ) < 0) {
       JNILOG("Could not avformat_write_header!");
       (*env)->ReleaseStringUTFChars(env, outfile, filepath);
       return 0;
    }
    (*env)->ReleaseStringUTFChars(env, outfile, filepath);
    YUVFrame = avcodec_alloc_frame();
    JNILOG("ffmpeg every thing inited");
    return JNI_TRUE;
    }

    and the encoding code looks like this:

    /* avpicture_fill only wraps the raw buffer in YUVFrame; it assumes the
       data is already PIX_FMT_YUV420P and performs no conversion */
    avpicture_fill((AVPicture *)YUVFrame,framedata,PIX_FMT_YUV420P,VIDEO_WIDTH,VIDEO_HEIGHT);
       // print the data planes!!!!
       AVPacket pkt;
       av_init_packet(&pkt);
       pkt.data = videoOutBuffer;
       pkt.size = VIDEO_OUT_BUFFER_SIZE;
       int gotpkt = 0;
       avcodec_encode_video2(avVideoStream->codec,&pkt,YUVFrame,&gotpkt);
       if (gotpkt > 0) {
           JNILOG("encoded size=%d,gotpktflag=%d",pkt.size,gotpkt);
           pkt.stream_index = avVideoStream->index;
           pkt.flags |= AV_PKT_FLAG_KEY;
           pkt.pts = timestamp;
           while (pkt.pts <= lastVideoPts) {
               pkt.pts++;
           }
           lastVideoPts = pkt.pts;
           if (av_interleaved_write_frame(avFormatContext, &pkt) < 0) {
               JNILOG("av_interleaved_write_frame failed");
           }
       }

    Could someone please help me with this problem? ^^