
Other articles (62)

  • Authorisations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors can modify their information on the authors page

  • The plugin: Managing mutualisation

    2 March 2010, by

    The mutualisation management plugin makes it possible to manage the various MediaSPIP channels from a master site. Its purpose is to provide a pure SPIP solution to replace this older solution.
    Basic installation
    Install the SPIP files on the server.
    Then add the "mutualisation" plugin at the root of the site, as described here.
    Customise the central mes_options.php file as you wish. As an example, here is the one from the mediaspip.net platform:
    <?php (...)

  • Installation in farm mode

    4 February 2011, by

    Farm mode makes it possible to host several MediaSPIP sites while installing its functional core only once.
    This is the method we use on this very platform.
    Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge, since the usual SPIP private area is no longer used.
    First of all, you must have installed the same files as the installation (...)

On other sites (7806)

  • Calling ffmpeg.c's main twice causes app crash

    21 October 2018, by user924

    Using FFmpeg 4.0.2 and calling ffmpeg.c's main function twice causes an Android app crash (using FFmpeg shared libs and JNI):

    A/libc: Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 20153

    It works fine with FFmpeg 3.2.5, though.

    FFmpeg 4.0.2 main

    int main(int argc, char **argv) {
       int i, ret;
       int64_t ti;

       init_dynload();

       register_exit(ffmpeg_cleanup);

       setvbuf(stderr,NULL,_IONBF,0); /* win32 runtime needs this */

       av_log_set_flags(AV_LOG_SKIP_REPEATED);
       parse_loglevel(argc, argv, options);

       if(argc>1 && !strcmp(argv[1], "-d")){
           run_as_daemon=1;
           av_log_set_callback(log_callback_null);
           argc--;
           argv++;
       }

    #if CONFIG_AVDEVICE
       avdevice_register_all();
    #endif
       avformat_network_init();

       show_banner(argc, argv, options);

       /* parse options and open all input/output files */
       ret = ffmpeg_parse_options(argc, argv);
       if (ret < 0)
           exit_program(1);

       if (nb_output_files <= 0 && nb_input_files == 0) {
           show_usage();
           av_log(NULL, AV_LOG_WARNING, "Use -h to get full help or, even better, run 'man %s'\n", program_name);
           exit_program(1);
       }

       /* file converter / grab */
       if (nb_output_files <= 0) {
           av_log(NULL, AV_LOG_FATAL, "At least one output file must be specified\n");
           exit_program(1);
       }

    //     if (nb_input_files == 0) {
    //         av_log(NULL, AV_LOG_FATAL, "At least one input file must be specified\n");
    //         exit_program(1);
    //     }

       for (i = 0; i < nb_output_files; i++) {
           if (strcmp(output_files[i]->ctx->oformat->name, "rtp"))
               want_sdp = 0;
       }

       current_time = ti = getutime();
       if (transcode() < 0)
           exit_program(1);
       ti = getutime() - ti;
       if (do_benchmark) {
           av_log(NULL, AV_LOG_INFO, "bench: utime=%0.3fs\n", ti / 1000000.0);
       }
       av_log(NULL, AV_LOG_DEBUG, "%"PRIu64" frames successfully decoded, %"PRIu64" decoding errors\n",
              decode_error_stat[0], decode_error_stat[1]);
       if ((decode_error_stat[0] + decode_error_stat[1]) * max_error_rate < decode_error_stat[1])
           exit_program(69);

       ffmpeg_cleanup(received_nb_signals ? 255 : main_return_code);
       return main_return_code;
    }

    FFmpeg 3.2.5 main

    int main(int argc, char **argv) {
       av_log(NULL, AV_LOG_WARNING, " Command start");

       int i, ret;
       int64_t ti;
       init_dynload();

       register_exit(ffmpeg_cleanup);

       setvbuf(stderr, NULL, _IONBF, 0); /* win32 runtime needs this */

       av_log_set_flags(AV_LOG_SKIP_REPEATED);
       parse_loglevel(argc, argv, options);

       if (argc > 1 && !strcmp(argv[1], "-d")) {
           run_as_daemon = 1;
           av_log_set_callback(log_callback_null);
           argc--;
           argv++;
       }

       avcodec_register_all();
    #if CONFIG_AVDEVICE
       avdevice_register_all();
    #endif
       avfilter_register_all();
       av_register_all();
       avformat_network_init();

       av_log(NULL, AV_LOG_WARNING, " Register to complete the codec");

       show_banner(argc, argv, options);

       /* parse options and open all input/output files */
       ret = ffmpeg_parse_options(argc, argv);
       if (ret < 0)
           exit_program(1);

       if (nb_output_files <= 0 && nb_input_files == 0) {
           show_usage();
           av_log(NULL, AV_LOG_WARNING, "Use -h to get full help or, even better, run 'man %s'\n",
                  program_name);
           exit_program(1);
       }

       /* file converter / grab */
       if (nb_output_files <= 0) {
           av_log(NULL, AV_LOG_FATAL, "At least one output file must be specified\n");
           exit_program(1);
       }

    //     if (nb_input_files == 0) {
    //         av_log(NULL, AV_LOG_FATAL, "At least one input file must be specified\n");
    //         exit_program(1);
    //     }

       for (i = 0; i < nb_output_files; i++) {
           if (strcmp(output_files[i]->ctx->oformat->name, "rtp"))
               want_sdp = 0;
       }

       current_time = ti = getutime();
       if (transcode() < 0)
           exit_program(1);
       ti = getutime() - ti;
       if (do_benchmark) {
           av_log(NULL, AV_LOG_INFO, "bench: utime=%0.3fs\n", ti / 1000000.0);
       }
       av_log(NULL, AV_LOG_DEBUG, "%"PRIu64" frames successfully decoded, %"PRIu64" decoding errors\n",
              decode_error_stat[0], decode_error_stat[1]);
       if ((decode_error_stat[0] + decode_error_stat[1]) * max_error_rate < decode_error_stat[1])
           exit_program(69);

       exit_program(received_nb_signals ? 255 : main_return_code);

       nb_filtergraphs = 0;
       nb_input_streams = 0;
       nb_input_files = 0;
       progress_avio = NULL;


       input_streams = NULL;
       nb_input_streams = 0;
       input_files = NULL;
       nb_input_files = 0;

       output_streams = NULL;
       nb_output_streams = 0;
       output_files = NULL;
       nb_output_files = 0;

       return main_return_code;
    }

    So what could the issue be? It seems FFmpeg 4.0.2 doesn't release something after the first command (resources, or static variables that are not reset to their initial values).
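
    A hedged sketch of the kind of workaround often applied for repeated JNI invocations (an assumption, not a confirmed fix): mirror the modified 3.2.5 build above by resetting, at the end of 4.0.2's main, the globals that ffmpeg_parse_options() fills, and adapt exit_program()/ffmpeg_cleanup() so error paths return to the JNI caller instead of terminating the process.

       /* Hedged sketch of a local patch to the tail of ffmpeg.c's main() in
          4.0.2 (not an official fix): reset the parser's global state so a
          second invocation from JNI starts clean, exactly as the modified
          3.2.5 build above does. Assumes exit_program() has been adapted to
          return to the JNI caller instead of calling exit(). */
       ffmpeg_cleanup(received_nb_signals ? 255 : main_return_code);

       /* state left behind by ffmpeg_parse_options() */
       nb_filtergraphs   = 0;
       input_streams     = NULL;  nb_input_streams  = 0;
       input_files       = NULL;  nb_input_files    = 0;
       output_streams    = NULL;  nb_output_streams = 0;
       output_files      = NULL;  nb_output_files   = 0;
       progress_avio     = NULL;

       return main_return_code;

    If the crash persists, the fault addr 0x0 in the log above suggests a NULL dereference, so any other statics in ffmpeg.c that keep state across runs (counters that are not zeroed, or pointers freed in ffmpeg_cleanup()) would be the next candidates to check.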

  • How to synchronize audio and video from 2 different inputs using FFmpeg and stream them over the network via RTP, in C++?

    5 November 2018, by ElPablo

    I am currently trying to develop an app in C++ that performs all of this:

    • Capture Video of the desktop
    • Capture Audio of the desktop
    • Video & Audio processing
    • Stream Audio & Video to another computer

    For this I am using the OpenCV and FFmpeg libraries.

    I have managed to capture the video with OpenCV, convert it into an AVFrame, encode the frame and send it over the network with FFmpeg.

    For the audio, I have also managed (with the help of the FFmpeg documentation, transcode_aac.c) to capture the audio from my sound card, then decode, convert, encode and send it over the network.

    Then I go to my other computer, and I read the 2 streams with FFplay:

    ffplay -loglevel repeat+level+verbose -probesize 32 -sync ext -i config.sdp -protocol_whitelist file,udp,rtp

    It works, I have the video and the audio... but the sound is not at all synchronized with the video; it arrives about 3 seconds later.

    My code is organized like this:

    I am using 3 AVFormatContexts:

    • audio input
    • video output
    • audio output

    I did that because RTP can only carry one stream, so I had to separate audio and video.

    So basically, I have 2 inputs and I need 2 outputs.

    I know how to do that on the command line with FFmpeg (and there it is synchronized), but I have no idea how to do it and keep the streams synchronized in C++.

    My guesses are:

    • I have to play with the time_base attribute of the packets during
      encoding => but how can I synchronize packets coming from two different
      AVStream and AVFormatContext objects? (see the sketch after this list)
    • Do I have to set the time_base attribute of the output audio from the
      input audio, or from the 30 FPS that I want? Same question for the output
      video
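
    A minimal sketch of one way to approach the first guess (the helper names and the surrounding setup are assumptions, not the poster's code): give every raw frame, audio and video, a pts derived from one shared clock, then rescale packet timestamps from the encoder time_base into the output stream time_base before muxing, so both RTP outputs carry timestamps on the same timeline.

    /* Minimal sketch, assuming the encoder contexts and the two output
       AVFormatContexts are set up elsewhere and each output has one stream. */
    #include <libavformat/avformat.h>
    #include <libavutil/mathematics.h>
    #include <libavutil/time.h>

    static int64_t start_time_us;   /* shared origin, set once before capture */

    /* stamp a raw frame (audio or video) on the shared timeline */
    static void stamp_frame(AVFrame *frame, AVCodecContext *enc_ctx)
    {
        int64_t elapsed_us = av_gettime_relative() - start_time_us;
        frame->pts = av_rescale_q(elapsed_us, (AVRational){1, 1000000},
                                  enc_ctx->time_base);
    }

    /* rescale an encoded packet into the muxer's time_base and send it */
    static void write_packet(AVPacket *pkt, AVCodecContext *enc_ctx,
                             AVFormatContext *out_ctx)
    {
        AVStream *st = out_ctx->streams[0];   /* one stream per RTP output */
        av_packet_rescale_ts(pkt, enc_ctx->time_base, st->time_base);
        pkt->stream_index = st->index;
        av_interleaved_write_frame(out_ctx, pkt);
        av_packet_unref(pkt);
    }

    For audio it is usually steadier to derive pts from the running sample count (with a time_base of 1/sample_rate) and only anchor the first packet to the shared clock; the key point is that both outputs must count from the same origin, and a constant offset like the 3 seconds observed here typically means the two outputs start their timelines at different moments.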

    Further information:

    • The video is captured using this
      OPENCV Desktop Capture
      and then converted into an AVFrame with the sws_scale() function (see the
      sketch after this list)

    • I am using 4 threads (video capture, video processing, audio decoding,
      audio processing)
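
    For the capture-to-AVFrame step, a minimal sketch of the sws_scale() call (bgr_to_yuv420p is a made-up helper; the packed BGR24 input format is an assumption based on OpenCV's default Mat layout, and error checks are omitted):

    #include <libswscale/swscale.h>
    #include <libavutil/frame.h>

    /* Convert one captured BGR24 buffer (e.g. a cv::Mat's data pointer)
       into a freshly allocated YUV420P AVFrame. */
    static AVFrame *bgr_to_yuv420p(const uint8_t *bgr, int width, int height)
    {
        AVFrame *dst = av_frame_alloc();
        dst->format = AV_PIX_FMT_YUV420P;
        dst->width  = width;
        dst->height = height;
        av_frame_get_buffer(dst, 32);   /* 32-byte alignment works across versions */

        struct SwsContext *sws = sws_getContext(width, height, AV_PIX_FMT_BGR24,
                                                width, height, AV_PIX_FMT_YUV420P,
                                                SWS_BILINEAR, NULL, NULL, NULL);
        const uint8_t *src[1] = { bgr };
        const int src_stride[1] = { 3 * width };   /* packed BGR: 3 bytes per pixel */

        sws_scale(sws, src, src_stride, 0, height, dst->data, dst->linesize);
        sws_freeContext(sws);
        return dst;
    }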

    So guys, if you have any ideas on how to synchronize audio and video, or other tips that can help me, I would be glad to hear them.

    Thx

  • How To Get FFMPEG To Continuously Overwrite/Append Audio File?

    28 October 2018, by GPinskiy

    I installed an external audio card onto my Raspberry Pi 3 and I want to Chromecast the recorded audio. I have set up a Node.js server to cast things and have set up the sound card using alsamixer. I can correctly hear the line-in on the headphones when I use the command arecord -D hw:0,0 -r 48000 -f S32_LE -c 2 | aplay -D dmix:CARD=audioinjectorpi,DEV=0 -r 48000 -f S32_LE -c 2 to simulate pass-through.

    The last step is to actually expose this stream in a way that the Chromecast can access. The Chromecast can't use an RDP stream or anything similar, only files. So I thought that I could get away with having FFMPEG create an mp3 file that it continuously appends to while dropping the oldest x seconds, so that the overall length of the mp3 file stays at about 20 seconds, and having a local web server that the Chromecast can get that file from.

    I see that there is a way to automatically segment in FFMPEG, but that creates a bunch of separate 20-second files rather than a single file that is 20 seconds long. What would be the correct way of doing this?
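
    One hedged possibility (an untested assumption rather than a confirmed Chromecast recipe) is to let FFmpeg's HLS muxer keep a rolling window instead of a single growing mp3: a command along the lines of ffmpeg -f alsa -i hw:0,0 -c:a aac -f hls -hls_time 4 -hls_list_size 5 -hls_flags delete_segments live.m3u8 keeps only the last few segments on disk and rewrites the playlist as it goes, so the local web server always exposes roughly the last 20 seconds and the Chromecast fetches the playlist instead of one continuously overwritten file.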