Media (91)

Other articles (38)

  • Customising the categories

    21 June 2013, by

    Category creation form
    For those who know SPIP well, a category can be thought of as a section (rubrique).
    For a document of the "category" type, the fields offered by default are: Text
    This form can be modified under:
    Administration > Configuration des masques de formulaire.
    For a document of the "media" type, the fields not displayed by default are: Short description
    It is also in this configuration section that you can specify the (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.

  • Contribute to translation

    13 April 2011

    You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows it to spread to new linguistic communities.
    To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
    MediaSPIP is currently available in French and English (...)

On other sites (7132)

  • FFMPEG rtmp live stream with android fps is too low

    27 November 2015, by 蔡明原

    I’m interested in live streaming from Android to YouTube.
    I have compiled FFmpeg and written JNI code to live stream from Android,
    but the frame rate is too low: it is always around 3 to 5 fps, which makes for a poor viewing experience.
    How can I increase the fps?
    This is my JNI code for the FFmpeg initialisation.

     AVCodecContext *video_codec_ctx = video_stream->codec;
     video_codec_ctx->codec_id = video_codec->id;
     video_codec_ctx->codec_type = AVMEDIA_TYPE_VIDEO;
     video_codec_ctx->level = 31;

     video_codec_ctx->width = width;
     video_codec_ctx->height = height;
     video_codec_ctx->pix_fmt = PIX_FMT_YUV420P;

     /* rate control and GOP structure */
     video_codec_ctx->rc_max_rate = 0;
     video_codec_ctx->rc_buffer_size = 0;
     video_codec_ctx->gop_size = 30;
     video_codec_ctx->max_b_frames = 0;
     video_codec_ctx->slices = 8;
     video_codec_ctx->b_frame_strategy = 1;
     video_codec_ctx->coder_type = 0;
     video_codec_ctx->me_cmp = 1;
     video_codec_ctx->me_range = 16;
     video_codec_ctx->qmin = 10;
     video_codec_ctx->qmax = 51;
     video_codec_ctx->keyint_min = 25;
     video_codec_ctx->refs = 3;
     video_codec_ctx->trellis = 0;
     video_codec_ctx->scenechange_threshold = 40;
     video_codec_ctx->flags |= CODEC_FLAG_LOOP_FILTER;
     video_codec_ctx->me_method = ME_HEX;
     video_codec_ctx->me_subpel_quality = 6;
     video_codec_ctx->i_quant_factor = 0.71;
     video_codec_ctx->qcompress = 0.6;
     video_codec_ctx->max_qdiff = 4;
     video_codec_ctx->time_base.den = 30;   /* 30 fps target */
     video_codec_ctx->time_base.num = 1;
     video_codec_ctx->bit_rate = 5200 * 1000;
     video_codec_ctx->bit_rate_tolerance = 0;
     video_codec_ctx->flags2 |= 0x00000100;

     fmt_context->bit_rate = 6000 * 1000;

     /* These are x264 private options, so search child objects
        (the codec's priv_data) when setting them. */
     av_opt_set(video_codec_ctx, "partitions", "i8x8,i4x4,p8x8,b8x8",
                AV_OPT_SEARCH_CHILDREN);
     av_opt_set_int(video_codec_ctx, "direct-pred", 1, AV_OPT_SEARCH_CHILDREN);
     av_opt_set_int(video_codec_ctx, "rc-lookahead", 0, AV_OPT_SEARCH_CHILDREN);
     av_opt_set_int(video_codec_ctx, "fast-pskip", 1, AV_OPT_SEARCH_CHILDREN);
     av_opt_set_int(video_codec_ctx, "mixed-refs", 1, AV_OPT_SEARCH_CHILDREN);
     av_opt_set_int(video_codec_ctx, "8x8dct", 0, AV_OPT_SEARCH_CHILDREN);
     av_opt_set_int(video_codec_ctx, "weightb", 0, AV_OPT_SEARCH_CHILDREN);

     if (fmt_context->oformat->flags & AVFMT_GLOBALHEADER)
         video_codec_ctx->flags |= CODEC_FLAG_GLOBAL_HEADER;

     LOGI("Opening video codec");
     AVDictionary *vopts = NULL;
     av_dict_set(&vopts, "profile", "main", 0);
     //av_dict_set(&vopts, "vprofile", "main", 0);
     av_dict_set(&vopts, "rc-lookahead", "0", 0);  /* dictionary values must be strings */
     av_dict_set(&vopts, "tune", "film", 0);
     av_dict_set(&vopts, "preset", "ultrafast", 0);
     av_opt_set(video_codec_ctx->priv_data, "tune", "film", 0);
     av_opt_set(video_codec_ctx->priv_data, "preset", "ultrafast", 0);
     int open_res = avcodec_open2(video_codec_ctx, video_codec, &vopts);
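
    A quick way to tell whether these encoder settings, rather than the Android capture or JNI path, are the bottleneck is to benchmark comparable options offline against a synthetic source. This is only a sketch: the resolution, rate and bitrate below are illustrative, not taken from the device.

    ```shell
    # Settings comparable to the question: ultrafast preset, no B-frames,
    # 720p at 30 fps. testsrc2 generates synthetic video, so only encoder
    # speed is measured, not capture or network.
    bench_cmd="ffmpeg -hide_banner -f lavfi -i testsrc2=size=1280x720:rate=30 \
      -t 5 -c:v libx264 -preset ultrafast -tune zerolatency -bf 0 \
      -b:v 1500k -f null -"
    echo "$bench_cmd"

    # Run it only if ffmpeg is installed; the final lines report fps and speed.
    if command -v ffmpeg >/dev/null 2>&1; then
        $bench_cmd 2>&1 | tail -n 2
    fi
    ```

    If the offline encode reaches well above 30 fps, the slowdown is more likely in frame capture or pixel-format conversion than in x264 itself.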

  • Synchronising Video/audio with two different sources

    22 mai 2016, par Indy

    I have an issue with a very old film that I’m trying to recover. I have two sources of the same film.

    Source 1 (from ffmpeg -i):

    Duration: 02:43:35.77, start: 0.000000, bitrate: 1342 kb/s
    Stream #0:0: Video: mpeg4 (Simple Profile) (DIVX / 0x58564944), yuv420p, 720x480 [SAR 1:1 DAR 3:2], 1200 kb/s, 29.97 fps, 29.97 tbr, 29.97 tbn, 30k tbc
    Stream #0:1: Audio: mp3 (U[0][0][0] / 0x0055), 48000 Hz, stereo, s16p, 128 kb/s

    Source 2 (from ffmpeg -i):

    Duration: 02:46:11.04, start: 0.000000, bitrate: 1625 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1200x720 [SAR 1:1 DAR 5:3], 1431 kb/s, 30 fps, 30 tbr, 30 tbn, 60 tbc (default)
    Metadata:
     handler_name    : VideoHandler
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 191 kb/s (default)
    Metadata:
     creation_time   : 2014-06-22 12:06:22
     handler_name    : IsoMedia File Produced by Google, 5-11-2011

    What I would like is to take the audio from source 1 and video from source 2 and create a new file with them.

    I have tried converting source 1 to 30 fps and then using its audio with source 2’s video, but the audio really drifts out of sync after a while. Perhaps my approach is wrong.

    What’s the best way to take the audio from source 1 and keep it synced with source 2’s video?

    Any help is appreciated. Thanks,
    Indy
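
    One common approach for this kind of recovery, sketched here under the assumption that the drift is a simple linear stretch and with hypothetical filenames (source1.avi, source2.mp4): copy the video from source 2, take the audio from source 1, and slow the audio by the ratio of the two durations reported above with the atempo filter.

    ```shell
    # Audio (source 1) runs 02:43:35.77 = 9815.77 s; video (source 2) runs
    # 02:46:11.04 = 9971.04 s. An atempo factor below 1 slows the audio,
    # stretching it to match the longer video.
    atempo=$(awk 'BEGIN { printf "%.5f", 9815.77 / 9971.04 }')
    echo "atempo factor: $atempo"

    # Filenames are hypothetical. Video is stream-copied; the audio must be
    # re-encoded because atempo works on decoded samples.
    if [ -f source1.avi ] && [ -f source2.mp4 ]; then
        ffmpeg -i source2.mp4 -i source1.avi \
               -map 0:v -map 1:a \
               -c:v copy -af "atempo=$atempo" -c:a aac \
               output.mp4
    fi
    ```

    If the difference is not a uniform stretch (for example, the two transfers were cut differently), it may work better to align a few reference points by ear and trim with -ss/-t before muxing.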
